Remote google-bigquery Jobs

This Month

Paid Research Study for Data Professionals
apache-spark data-warehouse google-bigquery amazon-redshift Jul 27

User Research International is a research company based out of Redmond, Washington. Working with some of the biggest companies in the industry, we aim to improve your experience via paid research studies. Whether it's the latest video game or productivity tools, we value your feedback and experience.

We are currently conducting a research study called the Cloud-Based Data Study. We are looking for currently employed Data Professionals who have experience with cloud-based data warehouses. This is a one-time remote study conducted via an online meeting. We're offering $150 for participation; sessions are 90 minutes long. These studies give our researchers a platform to gather feedback on existing or upcoming products and software.

We have included the survey link for the study below. Taking the survey will help determine whether you fit the profile requirements; completing it does not guarantee you will be selected to participate. If it's a match, we'll reach out with a formal confirmation and any additional details you may need.

I have summarized the study details below. In order to be considered, you must take the survey below. Thank you!

Study: Cloud-Based Data Study

Gratuity: $150

Session Length: 90 mins

Location: Remote

Dates: Available dates are located within the survey

Survey: Cloud-Based Data Study Sign-Up


This Year

Senior Node.js Developer
node-js aws mongodb google-bigquery firebase senior Apr 13

This is a 100% remote role. You will need to meet with the development team during European hours, so being based in Europe or Asia will make it easier to work with the team. Hours are somewhat flexible, but you need at least 4 hours of overlap per day with the rest of the team, who are in Europe and Asia.

You must have:

  • 5+ years of back-end experience
  • Significant experience with Node.js and MongoDB
  • Experience operating a Node.js application at scale
  • Experience with test-driven development

It’s a plus if you know:

  • MongoDB sharding
  • AWS S3, AWS SQS, Amazon Redshift
  • Google BigQuery
  • Firebase
  • Redis
  • Heroku
  • AWS Lambda

This is a 100% remote (telecommute / work-from-home) Node.js back-end developer position.

How to apply: 

Please go to this link - and complete all the initial requirements for your application. 

Software Engineer
ruby-on-rails postgresql node-js go google-bigquery saas Jan 16

We are:

Shogun (YC W18): a page builder platform for eCommerce stores. We're one of the most popular apps on Shopify and BigCommerce. Our fully remote team of 50 is located all around the world.

We need:

A full-stack Rails engineer with an interest in the business side of things. You'll work closely with our growth team to move growth initiatives forward.


  • Work on special projects to drive growth at Shogun
  • Build out and maintain internal software (admin, affiliate tracking system, etc.)
  • Build out new platform and partner software integrations
  • Build out integrations between software services we use for business operations (CRM systems, campaign management software, etc.)
  • Build out and maintain our APIs for our technology partner program
  • Write queries and work with analytics tools to understand the business

You must:

  • Know or be interested in knowing what drives a startup SaaS business (MRR, LTV, Churn, CAC, ARPU, Conversion Rate, Acquisition Funnels, etc.)
  • Know Rails like the back of your hand
  • Enjoy scaling challenges as we're growing fast
  • Have a super high standard of quality
  • Be comfortable looking at data
  • Have at least a few years of experience with tech companies
  • Have built some cool things we can check out
  • Have a few references we can say hi to
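The SaaS metrics named above are simple ratios. As a rough illustration with entirely made-up numbers (none of these figures are Shogun's), they might be computed like so:

```python
# Hypothetical monthly figures for illustration only -- not real data.
starting_customers = 400
churned_customers = 12
new_customers = 30
monthly_revenue = 20_000.0      # total subscription revenue this month (MRR)
acquisition_spend = 4_500.0     # marketing + sales spend this month

churn_rate = churned_customers / starting_customers   # fraction lost per month
arpu = monthly_revenue / starting_customers           # average revenue per user
ltv = arpu / churn_rate                               # simple lifetime-value estimate
cac = acquisition_spend / new_customers               # customer acquisition cost

print(f"churn={churn_rate:.1%}  ARPU=${arpu:.2f}  LTV=${ltv:.2f}  CAC=${cac:.2f}")
```

The LTV formula here is the simplest textbook approximation (ARPU divided by monthly churn); real SaaS analytics usually refine it with margins and discounting.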

Technologies we use:

  • rails
  • go
  • nodejs
  • postgres
  • mongodb
  • BigQuery
  • heroku

We offer:

  • Solid pay
  • Trips to international offsites
  • Health Benefits (US)
  • And more.
Data Engineer
python sql google-bigquery pandas airflow data science Jan 06

Position Overview:

The ideal candidate is an experienced data engineer. You will help us develop and maintain our data pipelines, built with Python, Standard SQL, pandas, and Airflow on Google Cloud Platform. We are in a transitional phase: refactoring our legacy Python data transformation scripts into iterable Airflow DAGs and developing CI/CD processes around these data transformations. If that sounds exciting to you, you'll love this job.

You will be expected to build scalable data ingress and egress pipelines across data storage products, deploy new ETL pipelines, and diagnose, troubleshoot, and improve existing data architecture. We work in a fast-paced, flexible, start-up environment, and we welcome your adaptability, curiosity, passion, grit, and creativity as we contribute to cutting-edge research of this growing, fascinating industry.

Key Responsibilities:

  • Build and maintain ETL processes with our stack: Airflow, Standard SQL, pandas, spaCy, and Google Cloud. 
  • Write efficient, scalable code to munge, clean, and derive intelligence from our data
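The responsibilities above center on munging and cleaning raw data before it feeds anything downstream. The posting's actual stack is pandas, Airflow, and Google Cloud; purely as a sketch, with hypothetical field names and stdlib-only code, the clean-up step an Airflow task might wrap could look like this:

```python
import csv
import io

# Hypothetical raw export -- in a real pipeline this would come from GCS or BigQuery.
raw = """respondent_id,sentiment, score
101,  Positive ,0.92
102,negative,0.31
103,,not_a_number
"""

def clean_rows(text):
    """Normalize headers and values; drop rows with missing or unparseable fields."""
    reader = csv.DictReader(io.StringIO(text))
    reader.fieldnames = [h.strip() for h in reader.fieldnames]  # fix " score" header
    cleaned = []
    for row in reader:
        sentiment = (row["sentiment"] or "").strip().lower()
        try:
            score = float(row["score"])
        except (TypeError, ValueError):
            continue                      # skip rows with a bad numeric field
        if not sentiment:
            continue                      # skip rows missing a label
        cleaned.append({"respondent_id": int(row["respondent_id"]),
                        "sentiment": sentiment,
                        "score": score})
    return cleaned

rows = clean_rows(raw)
print(rows)   # only the two well-formed rows survive
```

In the posting's stack, the same normalization would typically be a pandas transformation inside a DAG task, with the cleaned output written back to a warehouse table rather than printed.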

Qualifications & Skills: 


  • 1-3 years experience in a data-oriented Python role, including use of:
    • Google Cloud Platform (GCE, GBQ, Cloud Composer, GKE)
    • Airflow
    • CI/CD tooling such as GitHub Actions or CircleCI
    • Docker
  • Fluency in the core tenets of the Python data science stack: SQL, pandas, scikit-learn, etc.
  • Familiarity with modern NLP systems and processes, ideally spaCy


Preferred:

  • Demonstrated ability to collaborate effectively with non-technical stakeholders
  • Experience scaling data processes with Kubernetes 
  • Experience with survey and/or social media data
  • Experience preparing data for one or more interactive data visualization tools like PowerBI or Tableau


Benefits:

  • Choose your own laptop
  • Health Insurance
  • 401K