Responsibilities:
- Define the target ELT architecture using dbt on Snowflake, integrated with Azure services (ADF, ADLS, Synapse/Databricks, Key Vault, Azure DevOps/GitHub).
- Translate legacy Informatica mappings, workflows, and sessions into modular dbt models (staging, core, and mart layers).
- Establish modeling standards (naming conventions, layer design, folder/package structure) for the staging, integration, and mart layers.
- Define and implement performance-optimized patterns in dbt and Snowflake: incremental models, clustering, partitioning logic, and query tuning (see the incremental-model sketch below).
- Lead the migration strategy, roadmap, and wave planning for converting Informatica jobs to dbt on Snowflake.
- Analyze existing ETL logic, dependencies, and schedules in Informatica, and design equivalent or improved logic in dbt.
- Design a repeatable migration factory: templates, accelerators, mapping spreadsheets, and conversion playbooks for Informatica → dbt (a macro-based translation sketch appears below).
- Oversee conversion, unit testing, and parallel runs to validate that dbt models match legacy outputs on row counts, aggregates, and business rules (a parity-test sketch appears below).
- Lead hands-on development of dbt models, seeds, snapshots, tests, macros, and documentation (a snapshot sketch appears below).
- Define and implement a testing strategy using dbt tests (schema tests, data tests, custom tests) and integrate it with broader data quality checks (a custom-test sketch appears below).
- Set up and maintain dbt environments (dev/test/prod), profiles, and connections to Snowflake on Azure.
- Introduce and enforce code quality practices: code reviews and pull requests; modular, reusable models and packages.

Requirements:
- 8+ years of total experience in Data Engineering / ETL / Data Warehousing.
- 3+ years of hands-on experience with dbt (Core or Cloud) building production-grade pipelines.
- Proven experience leading an Informatica → dbt migration to Snowflake on Azure (or a similar large-scale ETL modernization).
- Strong Snowflake experience: designing and developing schemas, views, and warehouses, plus performance optimization.
- Solid working knowledge of the Azure data stack: Azure Data Factory, ADLS, Azure DevOps/GitHub.
- Very strong SQL skills and a clear understanding of ELT patterns versus traditional ETL.
- Deep understanding of dbt concepts: models, sources, seeds, and snapshots; Jinja/macros, packages, and exposures; tests, documentation, and dbt Cloud or the dbt Core CLI.
- Strong knowledge of Informatica PowerCenter/IDQ objects (mappings, workflows, sessions) and how to map them to dbt patterns.
- Experience setting up and managing CI/CD for dbt projects.
- Familiarity with Python and/or Spark is a plus for complex transformations or auxiliary tooling.
- Experience connecting BI tools (Power BI, Tableau, Looker, etc.) to Snowflake models and semantic layers.
- Strong communication skills to explain migration decisions, trade-offs, and impacts to both technical and business stakeholders.
- Ability to work in Agile/Scrum environments and manage competing priorities.
- Strong analytical and problem-solving mindset, with a focus on automation and standardization.
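A minimal sketch of the incremental-model-plus-clustering pattern named above, assuming a hypothetical sales.orders source (declared in a sources.yml) with an order_date column usable as a watermark; the real unique key, columns, and clustering choice would come from the actual workload:

    -- models/core/fct_orders.sql (hypothetical model name)
    {{
      config(
        materialized = 'incremental',
        unique_key   = 'order_id',
        cluster_by   = ['order_date']
      )
    }}

    select
        order_id,
        customer_id,
        order_date,
        order_amount
    from {{ source('sales', 'orders') }}

    {% if is_incremental() %}
      -- On incremental runs, only pick up rows newer than what is
      -- already loaded into the target table.
      where order_date > (select max(order_date) from {{ this }})
    {% endif %}

On the first run dbt builds the full table; afterwards it merges only new rows on order_id, which is typically where most of the runtime savings over a full-reload Informatica session come from.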
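Recurring Informatica expression logic is a natural candidate for the migration-factory templates mentioned above. As one illustrative (not prescriptive) example, the common PowerCenter pattern IIF(LTRIM(RTRIM(col)) = '', NULL, LTRIM(RTRIM(col))) can be captured once as a dbt macro:

    -- macros/clean_string.sql (hypothetical macro)
    {% macro clean_string(column_name) %}
        nullif(trim({{ column_name }}), '')
    {% endmacro %}

Converted models then call {{ clean_string('customer_name') }} instead of re-deriving the expression, so every migrated mapping handles blank strings the same way.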
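For the parallel-run validation described above, one workable shape is a singular dbt test (a SQL file under tests/) that passes when it returns zero rows. The model name fct_orders, the legacy table legacy_dw.public.fct_orders, and the compared columns are all assumptions for illustration:

    -- tests/assert_fct_orders_matches_legacy.sql
    -- Compares daily row counts and amount totals between the migrated
    -- dbt model and the legacy Informatica-loaded table; any mismatched
    -- day is returned as a failing row.
    with dbt_side as (
        select order_date, count(*) as row_ct, sum(order_amount) as total_amt
        from {{ ref('fct_orders') }}
        group by order_date
    ),

    legacy_side as (
        select order_date, count(*) as row_ct, sum(order_amount) as total_amt
        from legacy_dw.public.fct_orders
        group by order_date
    )

    select
        coalesce(d.order_date, l.order_date) as order_date,
        d.row_ct    as dbt_rows,
        l.row_ct    as legacy_rows,
        d.total_amt as dbt_amount,
        l.total_amt as legacy_amount
    from dbt_side d
    full outer join legacy_side l
        on d.order_date = l.order_date
    where d.row_ct is distinct from l.row_ct
       or d.total_amt is distinct from l.total_amt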
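Where a legacy mapping implements a type-2 slowly changing dimension, the usual dbt counterpart is a snapshot. A minimal sketch, assuming a hypothetical crm.customers source with a reliable updated_at column:

    -- snapshots/customers_snapshot.sql
    {% snapshot customers_snapshot %}

    {{
        config(
          target_schema = 'snapshots',
          unique_key    = 'customer_id',
          strategy      = 'timestamp',
          updated_at    = 'updated_at'
        )
    }}

    -- dbt detects changes per customer_id and maintains
    -- dbt_valid_from / dbt_valid_to columns for history.
    select * from {{ source('crm', 'customers') }}

    {% endsnapshot %}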
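Beyond the built-in schema tests (unique, not_null, accepted_values, relationships), business rules can be packaged as custom generic tests. A small sketch; the rule itself is a placeholder:

    -- macros/not_negative.sql
    -- Custom generic test: dbt passes the model and column in; any rows
    -- returned count as failures.
    {% test not_negative(model, column_name) %}

    select {{ column_name }}
    from {{ model }}
    where {{ column_name }} < 0

    {% endtest %}

Once defined, the test is attached to columns in the project YAML like any built-in test, which keeps migrated business rules declarative and reviewable in pull requests.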