We are looking for an experienced Data Engineer to join our growing software engineering organization. The right person will help move our platform to the next level. You’ll work as part of a skilled, collaborative team to jointly design and implement high-visibility applications. This is an ideal job if you are an experienced data engineer who uses software engineering approaches to solve data problems, wants to be part of a small, highly skilled team that feels total ownership of its work, and can’t imagine a day without learning and coding. You will play a crucial role in the Xometry platform, and everything you do will matter.

What You’ll Do
Assemble data sets that meet functional / non-functional business requirements.
Build analytics models that utilize the data pipeline to provide actionable insights into key business performance metrics.
Maintain data pipelines and implement changes as requested.
Work with stakeholders, including the Product, Data, and Design teams, to assist with data-related technical issues and support their data needs.
Develop data models that help analytics and data science team members build and optimize their analyses.
Work with data and analytics experts to strive for greater functionality in our data systems.
Participate actively on a software development team designing, coding, testing, and releasing functionality to our customers.
Collaborate closely with other engineers and product managers as a valued member of an autonomous, cross-functional team.
Take operational responsibility for the services your team owns, potentially including participation in an on-call rotation.
Work in an environment that supports your individual growth.

What We’re Looking For
3+ years of experience as a software or data engineer in a fast-paced, technical, problem-solving environment.
Knowledge of the AWS data ecosystem, including RDS, Database Migration Service, S3, Athena, and Kinesis.
Cloud data warehouse experience with Redshift, Snowflake, or BigQuery.
Working knowledge of Looker, LookML, dbt, or similar data platform tools.
Design experience with relational, dimensional, 3NF, and Data Vault models.
Experience with iPaaS tools such as Workato, Dell Boomi, or Jitterbit.
Knowledge of data modeling best practices for transactional and analytical processing.
Strong SQL skills, including querying JSON and XML.
Experience with programming languages such as Python or JavaScript.
Experience working with DevOps / Site Reliability Engineering teams and practices.
Ability to set up CI/CD pipelines and maintain Terraform scripts for infrastructure.
Ability to work with APIs; familiarity with technologies such as REST and GraphQL.
A drive toward low-code/no-code implementations where appropriate.
Ability to design and propose solutions with a variety of data formats and data stores.
Working knowledge of message queuing, stream processing, and highly scalable data stores.
Understanding of web application architectures such as microservices, domain-driven design, and service-oriented architecture.
Must be a US citizen, green card holder, or a legal permanent resident of the United States.