Data Engineer
Data Analysis | Remote in Atlanta, GA | Contract and Temporary | From $70.00 to $80.00 per hour
Job Description
Data Engineer
A leading streaming company is looking for a Data Engineer to join the Data Product and Analytics team. The ideal candidate will develop and automate data transformation pipelines that integrate data sources from across the business to drive listener understanding and operational efficiency. As a Data Engineer, you will collaborate with departments across the organization to build data transformation pipelines; your responsibilities will include supporting the development, maintenance, and operational stability of data engineering and analytics infrastructure.
Data Engineer Pay and Benefits:
- Hourly pay: $70-$80
- Worksite: Leading streaming company (Remote; candidates must be located in the United States, Central and Eastern Time Zones only)
- W2 Employment, Group Medical, Dental, Vision, Life, Retirement Savings Program
- 40 hours/week, 3-month assignment to start
Data Engineer Responsibilities:
- Architect, build, and launch scalable data pipelines that support marketing growth efforts
- Build integrations and ETL processes with third-party data to automatically pull data from external sources into operational and analytical data pipelines using a combination of Python and SQL (see the sketch after this list)
- Use SQL and Python to write production-quality code to meet the data transformation needs of analysts, data scientists, and other business partners
- Streamline and automate tasks, and build reusable components in Google Cloud Platform that can be shared across multiple use cases and with the marketing analytics team
- Visualize, communicate, and present statistical model outputs and analysis to technical and non-technical stakeholders across the company
- Develop automated and ad hoc reports, dashboards, and models to help guide business decisions
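
For candidates curious about the flavor of this work, here is a minimal sketch of the kind of third-party pull described above. It assumes a hypothetical REST endpoint and a BigQuery destination table; the URL, table, and field names are invented for illustration, not part of the actual role.

    # Minimal ETL sketch: pull JSON from a hypothetical third-party API
    # and load it into a warehouse table. All names are illustrative.
    import requests
    from google.cloud import bigquery

    API_URL = "https://api.example.com/v1/listeners"    # hypothetical endpoint
    TABLE_ID = "my-project.analytics.listener_events"   # hypothetical table

    def extract():
        # Pull raw records from the external source.
        resp = requests.get(API_URL, timeout=30)
        resp.raise_for_status()
        return resp.json()["records"]  # assumed response shape

    def transform(records):
        # Keep only the fields the analytics team needs.
        return [
            {"listener_id": r["id"], "event": r["event_type"], "ts": r["timestamp"]}
            for r in records
        ]

    def load(rows):
        # Stream the transformed rows into BigQuery.
        client = bigquery.Client()
        errors = client.insert_rows_json(TABLE_ID, rows)
        if errors:
            raise RuntimeError(f"Load failed: {errors}")

    if __name__ == "__main__":
        load(transform(extract()))
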
Data Engineer Qualifications:
- Experience with Airflow (a minimal DAG sketch appears after this list).
- Experience with Linux concepts (e.g., cron, shell scripting).
- Experience with Python.
- Experience with R.
- Experience with Google Cloud Platform, AWS, or another cloud-based platform.
- Experience with Apache Spark.
- Experience developing scalable ETL processes using SQL and Python.
- Proficient in SQL, ideally with Hive, Presto, and/or BigQuery syntax.
- Proficient in Python.
- Knowledge of data warehouse architecture and data modeling.
- Knowledge of statistical modeling concepts and techniques.
- Excellent time management and attention to detail.
- Excellent written and oral communication skills.
- Experience creating and maintaining production dashboards in Tableau or Zeppelin.
- Experience working with cross-functional teams in a fast-paced environment.
- Quantitative marketing analytics experience.
- Understanding of basic A/B testing concepts.
- Familiarity with statistical regression techniques and applications.
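
As a point of reference for the Airflow and cron items above, here is a minimal sketch of a scheduled Airflow DAG. The DAG id, schedule, and task body are hypothetical; it simply shows a Python task wired to a daily cron expression, not the team's actual pipeline.

    # Minimal Airflow DAG sketch: run a daily transformation on a cron
    # schedule. The DAG id, schedule, and callable are illustrative.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def run_transformation():
        # Placeholder for the actual SQL/Python transformation step.
        print("transforming listener data")

    with DAG(
        dag_id="listener_data_pipeline",   # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule_interval="0 6 * * *",     # cron syntax: daily at 06:00
        catchup=False,
    ) as dag:
        PythonOperator(
            task_id="transform",
            python_callable=run_transformation,
        )
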