This role is for one of Weekday's clients.
Min Experience: 4 years
Location: India
Job Type: Full-time

We are seeking an experienced Data Engineer with strong expertise in Databricks and modern data engineering practices. The ideal candidate will have 4+ years of hands-on experience developing scalable data pipelines, managing distributed data systems, and supporting end-to-end CI/CD processes. This role involves architecting and optimizing data workflows that enable seamless data-driven decision-making across the organization.

Responsibilities
- Design, build, and maintain scalable ETL/ELT pipelines for large-scale datasets using Spark, Hive, or Glue.
- Develop and optimize data integration workflows using ETL tools such as Informatica, Talend, or SSIS.
- Write, optimize, and maintain complex SQL queries for data transformation and analytics.
- Collaborate with cross-functional teams, including data scientists, analysts, and product stakeholders, to translate requirements into technical solutions.
- Deploy data workflows using CI/CD pipelines and ensure smooth, automated releases.
- Monitor and optimize data workflows for performance, scalability, and reliability.
- Ensure data accuracy, governance, security, and compliance across pipelines.
- Work with cloud-based data platforms such as Azure (ADF, Synapse, Databricks) or AWS (EMR, Glue, S3, Athena).
- Maintain clear documentation of data systems, architectures, and processes.
- Provide mentorship and technical guidance to junior team members.
- Stay current with emerging data engineering tools, technologies, and best practices.

What You’ll Bring
- Bachelor’s degree in IT, Computer Science, or a related field.
- 4+ years of experience in data engineering and distributed data processing.
- Strong hands-on experience with Databricks or equivalent technologies (Spark, EMR, Hadoop).
- Proficiency in Python or Scala.
- Experience with modern data warehouses (Snowflake, Redshift, Oracle).
- Solid understanding of distributed storage systems (HDFS, ADLS, S3) and file formats such as Parquet and ORC.
- Familiarity with orchestration tools such as ADF, Airflow, or Step Functions.
- Databricks Data Engineering Professional certification (preferred; may be required depending on the engagement).
- Experience with multi-cloud or migration projects is a plus.