Remote pandas Jobs

This Month

Python Data Engineer
python mysql mongodb docker pandas javascript Sep 05

Permanent full time role in a team of 6 python developers. Fully remote, flexible schedule and autonomous self management. Immediate start available.

What you will work on

BitEdge odds comparisons and related features. Use web scraping and APIs to gather sport events and odds data. Process and persist the data to a database. Administer the VPS on which the system runs.
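As a rough illustration of the pipeline described above, here is a minimal pandas sketch of the odds-comparison step: raw records scraped from several bookmakers are normalized into a DataFrame and reduced to the best available price per event and outcome before persistence. All names and figures are hypothetical, not BitEdge's actual data or schema.

```python
import pandas as pd

# Hypothetical scraped records: one row per (event, outcome, bookmaker).
raw = [
    {"event": "TeamA v TeamB", "outcome": "TeamA", "bookmaker": "book1", "odds": 1.95},
    {"event": "TeamA v TeamB", "outcome": "TeamA", "bookmaker": "book2", "odds": 2.05},
    {"event": "TeamA v TeamB", "outcome": "TeamB", "bookmaker": "book1", "odds": 1.90},
]

df = pd.DataFrame(raw)

# Best price per (event, outcome) across bookmakers -- the core of an
# odds-comparison feature. idxmax picks the row with the highest odds
# in each group.
best = df.loc[df.groupby(["event", "outcome"])["odds"].idxmax()]
print(best[["event", "outcome", "bookmaker", "odds"]])
```

In production this reduction would run on a schedule (the ad mentions Celery) and the result would be written to MySQL or MongoDB rather than printed.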

Required skills and experience

  • Python.
  • Celery task scheduling.
  • Scrapy web scraping.
  • Pandas data processing.
  • MySQL and MongoDB.
  • Linux system administration.
  • System architecture.
  • Docker.
  • Git version control.
  • Live in a timezone between UTC+0 and UTC+10.

Optional skills and experience

  • Sports, betting and crypto.
  • Front end JavaScript and responsive HTML/CSS.
  • PHP.
  • WordPress.
  • Email marketing with subscription, email templates and sending.

Pay and conditions

Time commitment is full time. You will work remotely, on whichever days of the week and at whatever times of day you like. You will report to the CTO.


This Year

Python Developer
python kubernetes hl7-fhir pandas fastapi linux May 19

What we expect you to love:

  • Working on diverse and hard problems
  • Writing world class code and documentation
  • Creating solutions around user requirements
  • Challenging your own views and learning fast
  • Taking responsibility and getting things done

What we expect you to have:

  • A ‘can-do’ attitude and the ability to thrive in a fast-paced environment
  • Professional experience with state-of-the-art, high-level programming languages (preferably Python)
  • Experience with a set of common frameworks
  • In-depth Linux and networking knowledge
  • Experience working on a software project in a team
  • Experience with test-driven development
  • Beneficial: DevOps experience

What we offer:

  • We don’t care about your education, as long as you love what you’re doing
  • A crucial role in a highly motivated and focused team
  • Flexible working arrangements
  • Office in the city center of Graz (Free coffee and snacks, adjustable desks, headphones etc.)
Senior Data Analyst
python sql pandas data-analysis data-visualization senior Jan 16

Millions of people experience real-life adventures with our apps. We help people all over the world discover the best hiking and biking routes, empowering our users to explore more of the great outdoors. And we’re good at it: Google and Apple have listed us as one of their Apps of the Year numerous times, and with more than 9.5 million users and 50,000 five-star reviews, komoot is on its way to becoming one of the most popular cycling and hiking apps.

Join our fully remote team of 60+ people and change the way people explore!

We are now looking for an experienced and curious data analyst to help drive product decisions and develop the company strategy. We believe that data-informed decision making is key to our success, and your skills, curiosity, and experience will play a crucial role in building the future of outdoor experiences. You can work from anywhere within the UTC-1 to UTC+3 timezone range.

What you will do

  • Turn data into actionable insights
  • Develop analysis to drive product and business decisions and company strategy
  • Design and implement metrics, dashboards, and continuous reports
  • Communicate and discuss findings with different audiences (co-founders, marketing, product, sales, etc.)
  • Design and lead larger analytics projects to answer komoot’s key questions
  • Organize and prioritize tasks of our data analytics roadmap
  • Be an evangelist of data-informed decision-making

Why you will love it

  • You’ll be a key player in making product development, marketing, and business decisions
  • You’ll work closely with komoot’s co-founder and influence the future direction of komoot
  • You’ll influence and be responsible for the future development of analytics at komoot, both technically and organizationally
  • You’ll work in a fast-paced startup with strongly motivated and talented co-workers
  • You’ll enjoy the freedom to organize yourself the way you want 
  • We let you work from wherever you want, be it a beach, the mountains, your house, a co-working space of your choice, our HQ in Berlin/Potsdam, or anywhere else in a time zone between UTC-1 and UTC+3
  • You’ll travel together with our team to amazing outdoor places several times a year to exchange ideas, learnings and go for hikes and rides

You will be successful in this position if you

  • Have a burning desire to transform data into actionable insights
  • Have 3+ years of experience in evaluating marketing campaigns, cohort analysis, A/B testing and retention
  • Are fluent in SQL and Python’s data analytics libraries (pandas, numpy, matplotlib)
  • Have strong communication and team skills
  • Are familiar with statistical concepts
  • Have a good overview of technical solutions in the data analytics field
  • Have a hands-on attitude and are highly self-driven
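To make the cohort and retention work mentioned above concrete, here is a minimal illustrative sketch in pandas of a single-cohort retention curve: for each week since signup, the share of the original cohort that was still active. The data is invented for illustration, not komoot's.

```python
import pandas as pd

# Toy activity log: one row per user per active week,
# with "week" counted from each user's signup week.
events = pd.DataFrame({
    "user": [1, 1, 2, 2, 3],
    "week": [0, 1, 0, 2, 0],
})

# Cohort size = distinct users active in week 0 (their signup week).
cohort_size = events.loc[events["week"] == 0, "user"].nunique()

# Retention = share of the cohort active in each subsequent week.
retention = events.groupby("week")["user"].nunique() / cohort_size
print(retention)
```

A real analysis would bucket many signup cohorts into a retention matrix and plot it (e.g. with matplotlib), but the groupby-and-normalize step is the same.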

Sounds like you?

Then send us the following:

  • Your CV in English highlighting your relevant experience
  • A write-up explaining who you are and why you are interested in working at komoot
  • Examples of your work (e.g. Dashboards, PDFs, Slideshare, etc.)
  • Feel free to send us something that shows us a little more about what you’re interested in, such as your Twitter/Instagram account
Data Engineer
python sql google-bigquery pandas airflow data science Jan 06

Position Overview:

The ideal candidate is an experienced data engineer. You will help us develop and maintain our data pipelines, built with Python, Standard SQL, pandas, and Airflow within Google Cloud Platform. We are in a transitional phase of refactoring our legacy Python data transformation scripts into iterable Airflow DAGs and developing CI/CD processes around these data transformations. If that sounds exciting to you, you’ll love this job. You will be expected to build scalable data ingress and egress pipelines across data storage products, deploy new ETL pipelines, and diagnose, troubleshoot, and improve existing data architecture. Working in a fast-paced, flexible start-up environment, we welcome your adaptability, curiosity, passion, grit, and creativity to contribute to our cutting-edge research of this growing, fascinating industry.
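The refactor described above, turning a legacy one-shot script into DAG-shaped tasks, usually starts by splitting the script into small, idempotent functions with explicit inputs and outputs; in Airflow each one then becomes a task (e.g. via PythonOperator or the TaskFlow API). A hypothetical sketch of that shape, with stand-ins for the cloud reads and writes:

```python
import pandas as pd

def extract():
    # Stand-in for a BigQuery/Cloud Storage read.
    return pd.DataFrame({"text": ["Good product", "Bad service", None]})

def transform(df):
    # Cleaning step: drop missing rows, normalize case.
    # Pure function of its input, so it is easy to test and to retry.
    out = df.dropna().copy()
    out["text"] = out["text"].str.lower()
    return out

def load(df):
    # Stand-in for writing back to the warehouse; returns the row count.
    return len(df)

if __name__ == "__main__":
    print(load(transform(extract())))
```

Because each step is a pure function over a DataFrame, the same code can run in a unit test, a CI pipeline, or as an Airflow task without modification.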

Key Responsibilities:

  • Build and maintain ETL processes with our stack: Airflow, Standard SQL, pandas, spaCy, and Google Cloud.
  • Write efficient, scalable code to munge, clean, and derive intelligence from our data.

Qualifications & Skills:

  • 1-3 years of experience in a data-oriented Python role, including use of:
    • Google Cloud Platform (GCE, GBQ, Cloud Composer, GKE)
    • Airflow
    • CI/CD tooling such as GitHub Actions or CircleCI
    • Docker
  • Fluency in the core tenets of the Python data science stack: SQL, pandas, scikit-learn, etc.
  • Familiarity with modern NLP systems and processes, ideally spaCy


  • Demonstrated ability to collaborate effectively with non-technical stakeholders
  • Experience scaling data processes with Kubernetes 
  • Experience with survey and/or social media data
  • Experience preparing data for one or more interactive data visualization tools like PowerBI or Tableau

Benefits:

  • Choose your own laptop
  • Health Insurance
  • 401K