Intern, Data Engineering
Engineering | San Francisco, CA | Part Time
Taulia delivers working capital solutions that make it easy for businesses to free up cash, accelerate payments and improve supply chain health.
Since our founding in 2009, we’ve envisioned a world where every business thrives by liberating cash. Today, our team of financial gamechangers has built a network connecting 1.5 million businesses across 168 countries and accelerated more than $80 billion in early payments.
Using our state-of-the-art platform, businesses now have the option to choose when and how to pay and get paid. It sounds simple. But our painless process gives both buyers and suppliers the chance to skyrocket their cash - cash to fuel economic growth all over the world. It’s a win-win for everybody.
About the Job:
Taulia is seeking an intern to join our Data Engineering team for a three-month internship. As an intern, you:
Have a passion for building data products and systems.
Have an ability to learn new technologies quickly with minimal support and guidance.
Collaborate with other Data engineers, Data scientists, managers, product leaders and other stakeholders to ensure successful delivery of projects.
Identify and solve problems that will help move the needle for our cash forecasting products.
Constantly evaluate and optimize deployment of data models to achieve greater efficiency and reliability.
Are strong in SQL, databases, and data modeling methodologies.
Have exposure to scripting languages, preferably Python.
Are willing to build prototypes and/or reusable tools to answer research or business questions.
Requirements:
Current university student or recent graduate.
A passion for data and a willingness to help the business turn data into information and action.
Experience or education related to data warehousing.
Experience or education related to business intelligence practices & tools.
Effective written and verbal communication skills.
Nice to Have:
Experience with data warehouses, data lakes, big data, ETL, reporting & analytics, and data visualization tools.
Experience with Kinesis/Lambda (or similar streaming technologies): streaming messages, building Lambda functions.
Experience with Redshift/Aurora (or similar database/data lake): maintaining clusters, designing table structures, interacting with S3, and moving data.
Experience integrating with APIs, particularly Salesforce, Box, AWS, etc.
Experience with containerized/serverless model deployments.
Experience with data visualization tools such as Sisense, Tableau, or similar.
Advanced SQL knowledge.