Software Engineer - Data Platform
Job description
As a foundational software engineer on our data platform engineering team, you will lead the charge on a variety of projects, writing software to aggregate, store, and make sense of data.
Examples of possible work range from building data warehousing for ERP and machine data, to building software agents that pull data off of machines, to building certified data sets that enable operational and business intelligence use of that data.
You will be challenged to think creatively and solve complex data integration problems. You will work cross-functionally with production experts, software engineers, and machining specialists to develop novel solutions on the path toward fully automated factories.
What You'll Do
- Scope, architect, implement, and deploy critical applications that drive revenue and make a positive impact in the world.
- Build and manage a robust data warehouse and write software to coordinate and deploy data pipelines.
- Conceptualize and own the data architecture for multiple large-scale projects.
- Create and contribute to data frameworks that span on-premises and cloud infrastructure to improve the efficacy of logging machine data, while working with data infrastructure teams to triage and resolve issues.
- Solve our most challenging machine data integration problems, applying optimal ETL patterns, frameworks, and query techniques while sourcing from structured and unstructured data sources.
- Collaborate with machine engineers, product managers, and data scientists to understand data needs, representing key data insights visually in a meaningful way.
- Build alongside an incredible team of software engineers, mechanical engineers, operators, and the best machinists/CAM programmers in the world.
Requirements
- Have extensive experience shipping modern, data-centric applications (our data systems use Argo Workflows, Dagster, Superset, Aurora, RDS, and S3; our back ends are Go and Python, with gRPC/Avro and Kafka as our messaging platform).
- Have experience with IaC and GitOps tooling (we use Terraform extensively and have centralized on Kubernetes/Argo/Helm).
- Are extremely well versed in data querying techniques across NoSQL and SQL platforms.
- Have a Bachelor's degree in Computer Science and/or equivalent experience.
- Have a solid understanding of building data architectures and pipelines.
- Are self-motivated and eager to get hands-on and tackle challenges independently while working collaboratively toward identified objectives.
- Work with a platform mentality -- driven to find the right architecture and plan up front, and to solve problems with the long term in mind.
- Take responsibility and ownership, finding solutions no matter what.
- Deploy your broad experience and big-picture view to fix unforeseen problems with innovative solutions.
- Feel passionate about making things move in the real world with software.
- Are excited to work in a fast-paced environment with high stakes and quick iteration cycles.
- Are a highly effective communicator when speaking or writing, especially when presenting technical information.

To conform to U.S. Government space technology export regulations, including the International Traffic in Arms Regulations (ITAR), you must be a U.S. citizen, lawful permanent resident of the U.S., a protected individual as defined by 8 U.S.C. 1324b(a)(3), or eligible to obtain the required authorizations from the U.S. Department of State. Learn more about the ITAR here (https://www.pmddtc.state.gov/?id=ddtc_kb_article_page&sys_id=24d528fddbfc930044f9ff621f961987).
Benefits & conditions
For this role, the target salary range is $120,000 - $200,000 (actual range may vary based on experience).
- Medical, dental, vision, and life insurance plans for employees