Engineering | San Francisco, CA | Full Time
Taulia is a fintech provider of working capital management solutions headquartered in San Francisco, California. Taulia helps companies access value tied up in their payables, receivables and inventory. A network of more than 2 million businesses use Taulia’s platform to determine when they want to pay and be paid. Taulia processes more than $500 billion each year and is trusted by the world’s largest companies including Airbus, AstraZeneca, Nissan and Vodafone. For more information, please visit www.taulia.com.
About the Job:
Taulia is looking for an experienced Data Engineer who has a passion for building data products and systems. Our ideal candidate is bright, responsible, self-motivated, confident, and gets stuff done. We look for problem solvers who have the skill and experience to take large datasets and build the tools, processes, and systems that translate that data into actionable business decisions. You must be willing to do whatever it takes to get the job done while ensuring that we're building a scalable system for the future of Taulia and our customers. We're looking for someone who prides themselves on anticipating problems, looking beyond immediate issues, and taking the initiative to improve both our software and our development infrastructure.
In return, you'll get the reward of working inside a passionate, hungry, and FUN team. You'll get the joy of bringing a new product to life from concept to delivery, and the technical and business lessons of working with data and users at scale.
Essential Duties and Responsibilities:
Help build and maintain a streaming infrastructure (AWS Kinesis & Lambda) to translate data from transactional product systems into a scalable DataMart architected for efficient reporting (Aurora). This system must keep up with over 1M messages per day.
Help build and maintain processes to automatically supplement each streamed data piece with third-party data from numerous APIs and sources (Salesforce, CapIQ, Social123, etc.), keeping our DataMart up to date as a true company data hub.
Automate processes to transform datasets into efficient structures for reporting, and ensure that data is up to date in reporting tools (Salesforce, analytics tools, etc.).
Maintain all components of the data pipeline, from coding to deployments to monitoring.
Design, build, and support data transformation, conversion, and validation pipelines.
Design and support effective storage and retrieval of large datasets (180 million invoices, purchase orders, etc.).
Design and implement best practices for cloud-based cluster deployments.
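As a rough illustration of the day-to-day work described above, here is a minimal sketch of a Lambda function that consumes a batch of Kinesis records and flattens each one into a row shaped for a reporting DataMart. All field names (invoice_id, buyer_name, amount, etc.) are hypothetical placeholders, not Taulia's actual schema, and a real handler would go on to batch-insert the rows into Aurora.

```python
import base64
import json


def transform_record(kinesis_record):
    """Decode one Kinesis record (base64-encoded JSON payload) and map it
    to a flat row suitable for a reporting table.

    The payload fields used here are illustrative only.
    """
    payload = json.loads(base64.b64decode(kinesis_record["kinesis"]["data"]))
    return {
        "invoice_id": payload["id"],
        "buyer": payload["buyer_name"],
        # Store money as integer cents to avoid float rounding in reports.
        "amount_cents": int(round(float(payload["amount"]) * 100)),
        "currency": payload.get("currency", "USD"),
    }


def handler(event, context):
    """Lambda entry point: transform every record in the incoming batch.

    A production handler would also write the rows to Aurora and emit
    monitoring metrics; those steps are omitted from this sketch.
    """
    return [transform_record(record) for record in event["Records"]]
```

At 1M+ messages per day, batching the resulting rows into a single multi-row insert per invocation (rather than one round trip per record) is what keeps the Aurora write path efficient.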
Requirements:
4+ years of experience in a Data Engineering-oriented role.
Experience with AWS - Kinesis, RDS, Redshift, EC2, VPCs, S3, EBS Volumes, etc.
Experience with Python and Node.js.
Experience with MySQL.
BA/BS in Computer Science or related field.
Nice to have:
Experience with Kinesis/Lambda - Streaming messages, building consuming lambda functions.
Experience with Redshift/Aurora - maintaining clusters, designing table structures, interacting with S3 and moving data.
Experience integrating with APIs, particularly Salesforce, Box, AWS, etc.
GCP - Looker, BigQuery, and Google Kubernetes Engine (GKE).
Taulia is an Equal Opportunity Employer: Minority / Women / Disability / Veteran / Gender Identity / Sexual Orientation / Age.
If you don't think you meet all of the criteria above but still are interested in the job, please apply. Nobody checks every box, and we're looking for someone excited to join the team.