About the job
We are a group of ambitious builders with a passion for simple and elegant solutions to complex problems.
Our founding team has decades of industry experience working with data in general and data quality in particular, plus more than a decade of academic research in statistical and machine learning methods and software engineering. Fed up with the lack of user-friendly, flexible, and scalable tools for efficient data quality management, they decided to build something themselves.
Validio is building the gold standard platform for data quality monitoring and validation at the center of the modern data stack. And now, you can be part of our journey!
We need a passionate Data Engineer to help us develop our product, ensuring that our customers get high-quality data in real time. Sound cool? Read on - it gets better!
About the Role
We envision that, as a Data Engineer, you will focus on making sure the Validio platform integrates properly with external tools (such as cloud providers, data orchestration tools, data sources, data warehouses, data ingestion tools, BI tools, DBT, etc.). This includes, for example, prioritizing among integrations and deciding what each integration should look like. To thrive in this role, we believe it is important for you to be experienced with many of these tools, and to be curious and eager to learn how others use them (for example by participating in customer calls and asking around in your network of data engineers).
We are a small and pragmatic team that works closely together, and we’re growing quickly. This means two things:
1. The role description is not set in stone.
2. There are plenty of opportunities to grow with the company.
As a core member of a well-funded, early-stage startup, you will join our small but rapidly growing team, build the foundations of our platform - and be part of the journey of a lifetime!
You are a driven and motivated problem solver ready to pursue meaningful work. You strive to make an impact every day. Sound familiar? Awesome!
We believe you have:
- Experience using GCP and AWS.
- Experience using SQL and its dialects (e.g. Amazon Redshift and Google BigQuery).
- Experience with data orchestration tools like Airflow and Prefect.
- Experience with version control systems like GitHub and GitLab.
- Experience with Kafka or other streaming data sources.
- Experience with data warehouses like Snowflake, Amazon Redshift and Google BigQuery.
- Experience with ingestion tools like Fivetran and Airbyte.
- Experience with BI tools like Looker and Tableau.
- Experience with DBT.
- A passion for learning new technologies.
Do you want to be a part of our journey? Please apply by clicking the link below!
We believe that bringing people together from different backgrounds, experiences and perspectives makes for a healthy workplace, a more successful business and a better world. We value diversity and encourage everyone to come and join us on our mission to eliminate bad data 🚀