What you will do
- Design, develop, and implement tooling and datasets that enable telemetry-driven cost attribution and performance-informed financial modeling;
- Build and maintain scalable data pipelines and automation frameworks to support cost transparency and optimization initiatives;
- Develop systems that surface cost-saving opportunities and support committed-use discount modeling and management;
- Ensure data accuracy and reliability through end-to-end validation, auditing, and observability practices;
- Collaborate cross-functionally with Product, Engineering, Finance, and Analytics stakeholders to deliver high-impact data solutions;
- Improve and maintain the long-term scalability, performance, and reliability of ELT/ETL pipelines;
- Support and enhance CI/CD processes and infrastructure automation.
Must haves
- 4+ years of experience in a Data Engineering role;
- Strong hands-on experience with SQL and Python;
- Experience with workflow orchestration tools such as dbt, Airflow, or Dagster;
- Demonstrated experience designing and maintaining end-to-end ELT/ETL data pipelines;
- Strong background in data modeling and schema design;
- Experience with AWS services such as Glue, Aurora, and Athena, as well as Snowflake;
- Familiarity with CI/CD pipelines and Infrastructure as Code using Terraform;
- Experience building and consuming REST APIs;
- Strong analytical skills with exceptional attention to detail, including data auditing and validation;
- Upper-intermediate English level.
Nice to haves
- FinOps Practitioner or FinOps Engineer certification;
- Experience with Scala;
- Hands-on experience with cloud platforms such as AWS and GCP, and observability tools such as Datadog;
- Experience working with large-scale datasets and distributed systems;
- Strong understanding of cloud cost optimization strategies;
- Experience in highly data-driven, cross-functional environments.
About the role
We are looking for a Senior Data Engineer to build and maintain the data infrastructure that powers cost attribution, financial modeling, and cloud optimization initiatives. You will design scalable ELT/ETL pipelines and data models using Python, SQL, dbt, and Airflow, working across AWS services including Glue, Aurora, and Athena alongside Snowflake. The role sits at the intersection of engineering and finance, requiring close collaboration with Product, Engineering, and Analytics teams.
The benefits of joining us
Professional growth
Accelerate your professional journey with mentorship, TechTalks, and personalized growth roadmaps
Competitive compensation
We match your ever-growing skills, talent, and contributions with competitive USD-based compensation and budgets for education, fitness, and team activities
A selection of exciting projects
Join projects with modern solutions development and top-tier clients that include Fortune 500 enterprises and leading product brands
Flextime
Tailor your schedule for an optimal work-life balance by choosing to work from home or at the office, whichever makes you happiest and most productive.
Your AgileEngine journey starts here
- Tell us about yourself (2 min)
- Confirm requirements (2 sec)
- Pass a short test (30-60 min)
- Record a short video (5 min)
  → Introduce yourself on a video instead of waiting for an interview
- Technical interview: ace the live interview with our team
  → Schedule a call yourself right away after your video is reviewed
- Final interview with your team
  → Get to know the team you will be working with
- Get an offer, as quickly as possible
