Data Engineer (remote)

Salary: $125k-$150k, plus a 15-20% bonus and benefits

We are looking for a remote Data Engineer to join our growing, dynamic Engineering team. We are seeking someone with a technical background who enjoys solving complex problems and has professional experience owning ETL processes. The Data Engineer will be primarily responsible for building new pipeline components and maintaining existing ones in a complex technology stack that spans a variety of languages and frameworks.

The Engineering team is responsible for data health and quality in every step of the pipeline process, from initial ingestion to deployment and visualization. As a result, debugging can require a deep dive into several interfacing pieces of software, and on any given day a Data Engineer can expect to work on multiple components that perform very different functions.

You will be tasked with the following:

  • Manage, modify, and maintain our proprietary software responsible for storing and transforming data from a wide variety of sources and delivery methods
  • Design and build new components that scale to efficiently ingest, normalize, and process data from a growing number of different sources
  • Run distributed computing jobs using Databricks/Spark to prepare and transform terabytes of time-series and event data for modeling (see the sketch after this list)
  • Integrate external APIs into our products and use their data to streamline and add value to current offerings
  • Assist DevOps with optimization of company infrastructure
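
For a concrete sense of the Databricks/Spark work described above, here is a minimal PySpark sketch of a typical transformation; the S3 paths, column names, and schema are hypothetical placeholders, not our actual pipeline:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("event-normalization").getOrCreate()

    # Ingest raw event data (hypothetical S3 path)
    events = spark.read.parquet("s3://example-bucket/raw/events/")

    # Normalize timestamps and aggregate to hourly buckets per device
    hourly = (
        events
        .withColumn("event_ts", F.to_timestamp("event_time"))
        .withColumn("event_date", F.to_date("event_ts"))
        .withColumn("hour", F.date_trunc("hour", F.col("event_ts")))
        .groupBy("device_id", "event_date", "hour")
        .agg(
            F.count("*").alias("event_count"),
            F.avg("value").alias("avg_value"),
        )
    )

    # Write the modeling-ready table back out, partitioned by date
    (hourly.write.mode("overwrite")
           .partitionBy("event_date")
           .parquet("s3://example-bucket/curated/hourly_events/"))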

Qualifications & Skills

  • 2+ years of experience using Python 3 and its data science ecosystem, including Pandas and PySpark (Databricks)
  • Strong in at least one language other than Python; experience with shell scripting, especially Bash
  • Proficient with different flavors of SQL, especially PostgreSQL, including understanding of under-the-hood concepts like indexing and analysis of query plans (see the sketch after this list)
  • Experience with automation of DevOps processes in a cloud environment
  • Experience extracting data from, and pushing data to, a variety of sources including relational and non-relational databases, RESTful APIs, flat files, FTP servers, and distributed file systems
  • Experience with Agile / Scrum development methodologies
  • Experience with “XaaS” cloud services — we are an AWS shop but will consider candidates with similar experience on other cloud platforms
  • Excellent communication skills, both written and oral, especially when explaining difficult technical concepts to people in non-technical roles
  • Strong analytical skills, especially when working with multiple large datasets
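
As one small illustration of the query-plan analysis mentioned above, the sketch below runs EXPLAIN ANALYZE against PostgreSQL from Python via psycopg2; the connection string, table, and index are hypothetical placeholders:

    import psycopg2

    # Hypothetical DSN; in practice this would come from configuration/secrets
    conn = psycopg2.connect("dbname=analytics user=etl_user host=localhost")
    try:
        with conn.cursor() as cur:
            # Assumes an index on (device_id, event_ts); EXPLAIN ANALYZE shows
            # whether the planner uses it (index scan) or falls back to a seq scan
            cur.execute(
                """
                EXPLAIN (ANALYZE, BUFFERS)
                SELECT device_id, count(*)
                FROM events
                WHERE event_ts >= now() - interval '1 day'
                GROUP BY device_id;
                """
            )
            for (line,) in cur.fetchall():
                print(line)
    finally:
        conn.close()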

ETL Developer

We are hiring a remote ETL Developer to work with a government organization. This role requires an MBI clearance or Public Trust. It can be either a full-time, permanent position or a contract role, with a salary of $90K or $45/hr on a 1099.

Responsibilities:

  • Solid ETL experience with Informatica PowerCenter development and related component technologies (ETL Designer, etc.)
  • Experience in creating, maintaining, and optimizing stored procedures, functions, inline SQL, and ETL processes
  • Familiarity with Informatica performance tuning of sources, mappings, targets, and sessions
  • Familiarity with Software Development Life Cycle, specifically Agile, and able to independently participate in each phase
  • Experience working with Greenplum (PostgreSQL) and/or Oracle SQL/PLSQL
  • Previous experience in monitoring, supporting, and resolving issues for applications with technologies such as Informatica, Greenplum, etc.
  • Preferred: Previous government experience with an active MBI clearance; Informatica Developer certification
  • 5+ years of experience

SQL Developer

We are hiring a remote SQL Developer to work with a government organization. This role requires an MBI clearance or Public Trust. It can be either a full-time, permanent position or a contract role, with a salary of $90K or $45/hr on a 1099.

Responsibilities:

  • Experience in PL/pgSQL and SQL code development, maintenance, and migration
  • Experience in creating, maintaining, and optimizing stored procedures, functions, inline SQL, and ETL processes (see the sketch after this list)
  • In-depth knowledge of PostgreSQL architecture and application data lifecycle management
  • Knowledge of database design, query optimization, index management, and integrity checks
  • Familiarity with Software Development Life Cycle, specifically Agile, and able to independently participate in each phase
  • Previous experience in monitoring, supporting, and resolving issues for applications with Database technologies
  • Willingness to work during production maintenance hours and provide on-call support
  • Preferred: experience working with Greenplum (PostgreSQL) and Oracle SQL/PLSQL
  • Preferred: Previous government experience with an active MBI clearance
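
As a small illustration of the stored-procedure and PL/pgSQL work described above, the sketch below creates and calls a simple archival function from Python; the connection string, table, and function names are hypothetical placeholders:

    import psycopg2

    # Hypothetical PL/pgSQL function: move old rows into an archive table
    DDL = """
    CREATE OR REPLACE FUNCTION archive_old_events(cutoff timestamptz)
    RETURNS integer
    LANGUAGE plpgsql
    AS $$
    DECLARE
        moved integer;
    BEGIN
        WITH moved_rows AS (
            DELETE FROM events
            WHERE event_ts < cutoff
            RETURNING *
        )
        INSERT INTO events_archive SELECT * FROM moved_rows;
        GET DIAGNOSTICS moved = ROW_COUNT;
        RETURN moved;
    END;
    $$;
    """

    conn = psycopg2.connect("dbname=analytics user=etl_user host=localhost")
    try:
        # Connection context manager commits the transaction on success
        with conn, conn.cursor() as cur:
            cur.execute(DDL)
            # Call the function; the result is the number of archived rows
            cur.execute("SELECT archive_old_events(now() - interval '90 days');")
            print("archived rows:", cur.fetchone()[0])
    finally:
        conn.close()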