Python & SQL Data Engineer (Middle/Senior)

Job Details

Job Description

AgileEngine is one of the Inc. 5000 fastest-growing companies in the US and a top-3 ranked dev shop according to Clutch. We create award-winning custom software solutions that help companies across 15+ industries change the lives of millions.

If you like a challenging environment where you're working with the best and are encouraged to learn and experiment every day, there's no better place - guaranteed! :)

What you will do

- Build and optimize data pipelines to ensure reliable integration and transformation (see the sketch after this list);
- Design and maintain Snowflake tables and data models for analytics and reporting;
- Enhance data observability to proactively identify and resolve issues;
- Create dashboards and workflows to deliver business insights;
- Collaborate with stakeholders to ensure project transparency and success.
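
For a flavor of this kind of work, here is a minimal, illustrative sketch of an Airflow DAG that stages API data and loads it into Snowflake. It is not taken from the actual project: the DAG name, task names, and schedule are hypothetical placeholders, and the task bodies are stubbed.

```python
# Illustrative only: an Airflow 2.x DAG skeleton for an API -> Snowflake
# pipeline. All names here are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    """Pull data from a source API and stage it (stubbed)."""
    ...


def load_to_snowflake(**context):
    """Load the staged data into a raw Snowflake table (stubbed)."""
    ...


with DAG(
    dag_id="orders_daily",            # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",       # Airflow 2.x scheduling argument
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_to_snowflake", python_callable=load_to_snowflake)

    extract >> load                   # run the load only after extraction succeeds
```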

Must haves

- 3 years of experience in data modeling principles/methods, especially Kimball;
- Strong proficiency in SQL and experience working with Snowflake or similar cloud-based data warehouses;
- Strong proficiency in Python for data processing and automation;
- Hands-on experience building ETL pipelines using Apache Airflow or similar orchestration tools;
- Hands-on experience using DBT for data transformation, modeling, and testing;
- Strong experience consuming APIs and integrating data with Snowflake (see the sketch after this list);
- Ability to develop simple Tableau dashboards;
- Proficiency in GitLab CI/CD workflows for continuous integration and deployment;
- Familiarity with AWS services, especially S3;
- Strong analytical and problem-solving skills;
- Innovative and productivity-driven mindset – willingness to improve and automate processes;
- Ability to adapt to business needs and stakeholder requirements;
- Ability to communicate work clearly;
- Upper-Intermediate English level.
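
As a hedged illustration of the "consuming APIs and integrating data with Snowflake" requirement, the sketch below fetches JSON from a hypothetical REST endpoint and bulk-loads it with the Snowflake Python connector. The URL, credentials, and table name are placeholders, not details of the actual project.

```python
# Illustrative sketch only: fetch records from a hypothetical REST API and
# bulk-load them into Snowflake via write_pandas. Endpoint, credentials, and
# table names are placeholders.
import pandas as pd
import requests
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# 1. Consume the API (placeholder URL; assumes it returns a JSON array of records).
response = requests.get("https://api.example.com/v1/customers", timeout=30)
response.raise_for_status()
df = pd.DataFrame(response.json())

# 2. Connect to Snowflake (in practice, credentials come from a secrets manager).
conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="etl_user",           # placeholder
    password="***",            # placeholder
    warehouse="ETL_WH",
    database="RAW",
    schema="CUSTOMERS",
)

# 3. Bulk-load the DataFrame into a raw table.
write_pandas(conn, df, table_name="RAW_CUSTOMERS", auto_create_table=True)
conn.close()
```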

Nice to haves

- Basic knowledge of local development using Docker;
- Tableau or any BI development experience.

The benefits of joining us

Professional growth

Accelerate your professional journey with mentorship, TechTalks, and personalized growth roadmaps.

Competitive compensation

We match your ever-growing skills, talent, and contributions with competitive USD-based compensation and budgets for education, fitness, and team activities.

A selection of exciting projects

Join projects involving modern solution development for top-tier clients, including Fortune 500 enterprises and leading product brands.

Flextime

Tailor your schedule for an optimal work-life balance with the option of working from home or going to the office – whatever makes you happiest and most productive.

Next Steps After You Apply

The next steps of your journey will be shared via email within a few hours. Please check your inbox regularly and watch for updates from our Internal Applicant site, LaunchPod, which will guide you through the process.

Salary: Negotiable

Source: Kitempleo
