Data Engineer (100% Remote) vacancy at SimplyAnalytics

Negotiable salary

Remote worker location: Worldwide
English: Upper Intermediate, Advanced, or Native Speaker
Experience: 5+ years
Employment: Full-time


United States, New York
Industry: B2B SaaS
SimplyAnalytics is a B2B SaaS mapping, visualization, and data analytics application that makes it easy for anyone to create interactive maps, charts, and reports using 100,000+ data variables. With SimplyAnalytics, users can identify target locations, map the competition, track how places change over time, and identify where to market products. We are passionate about creating outstanding software, and we believe in automated testing, continuous integration, and code review.

Description of the job

We're looking for a Data Engineer to manage our existing data workflows, develop and maintain new ETL pipelines, and conduct data-related QA. You will create and maintain production-quality in-house tools within a large shared code base, and the data you curate will be used by thousands of university students, researchers, and marketing professionals.
The ideal candidate is a self-starter, has a high level of attention to detail, is comfortable asking questions, enjoys working with talented colleagues, and has an interest in analytics and data visualization. 
This is a 100% remote position; our developers can live and work anywhere. It is a full-time, salaried position.
Required Skills & Experience: 

  • Five years of professional software development work experience 
  • Expert relational database and data manipulation skills 
  • Experience with data orchestration platforms (Dagster, Airflow, or Prefect) 
  • Experience with development on large OOP software projects 
  • Thorough understanding of API design principles 
  • Ability to maintain our full data processing stack, primarily in Python but with legacy code in PHP 
  • Experience with Linux servers, including Linux command line and SSH 
  • Ability to write clean, production-quality, well-documented, maintainable code 

Bonus Skills & Experience: 
  • Experience with Hadoop (Hive and/or Trino) 
  • Experience using AWS services for big data tasks
