Data Engineer

Engineering | Foster City, CA | Full Time

Job Description

Hybrid (noun) hy•brid \ˈhī-brəd\

Something that is formed by combining two or more things.

We believe that few words define Techshed more accurately than hybrid. A start-up environment powered by the stability of The Home Depot has helped us create a company populated by a team of talented players who work tirelessly to solve real problems for real people, every single day. Our cutting-edge products are the result of lofty goals, deep research, healthy debates, boundless innovation, and most importantly, cross-functional collaboration. Everything we create, we create as a team.

In fact, our team is our company; they bring passion to everything they do. They’re friendly, outgoing, focused, and are some of the best and brightest that Silicon Valley has to offer. So come join the Techshed team and see what it’s like to create real products that actually improve our customers’ lives.

Our Perks

  • Shuttle Options
  • Paid Clipper Pass
  • Toll Reimbursement
  • 401(k) & ESPP
  • Medical, Vision, & Dental
  • Bicycle Friendly
  • Free Lunch
  • Tuition Reimbursement
  • Gym Reimbursement


About Us

Techshed, a Home Depot company, creates and develops interactive, user-friendly products that help improve how homeowners find and connect with qualified home service professionals in their neighborhood. Our product offerings currently include Pro Referral, Service Connect, and The Home Depot’s Home Services.

Core Responsibilities

  • Data Modeling: Extend existing data models and create new ones as required to support the company and its solutions.
  • Database Development: Design and develop database stored procedures, triggers, and functions.
  • Database Standards Creation and Enforcement: Create and enforce the company’s internal database standards; conduct trainings and code reviews to ensure adherence to standards and overall quality.
  • Data Integration: Design and implement data pipelines that allow for near real-time data consumption.

Core Requirements

  • Strong expertise with database technology (5+ years)
  • Strong performance tuning skills
  • Experience with PostgreSQL (9+) and MySQL (5.5+)
  • Experience with columnar data stores such as Vertica, Redshift or Infobright
  • Experience with Google BigQuery
  • An ability to write complex and efficient SQL statements and stored procedures
  • 5+ years production experience building batch data pipelines: data management, ETL processes, data mapping, data validation, testing, and documenting data definitions
  • Working knowledge of at least one of the following:  Python, Shell scripting
  • Experience developing against third-party APIs
  • Bachelor’s Degree in CS/EE or equivalent experience
  • Strong communication skills, enabling transparency and coordination across multiple business and service units
  • Experience collaborating with distributed teams
  • Experience with Git

Desired Qualifications (nice to have)

  • Working knowledge of Java, Scala or Go
  • Experience with provisioning and configuration frameworks: Ansible, Salt
  • Experience with Apache Spark, Kafka, and Hadoop
  • Experience with one of the following cloud computing providers: Google Compute Engine, Amazon Web Services (AWS), or Azure
  • Experience in the retail industry