Job Title: Senior Azure Data Engineer
Job Location: Remote
Skills Required: Azure Synapse, Azure Data Factory, Azure Databricks, ADLS Gen2, PySpark, SQL, and Microsoft Fabric.
Shift Timings: General Shift
Experience Range: 7-12 Years
Job Summary:
We are seeking a highly skilled and experienced Senior Azure Data Engineer to join our team. The ideal candidate will have deep expertise in Microsoft Azure data services, cloud-based data engineering solutions, and modern reporting tools. This role involves leading technical projects, designing scalable data architectures, and delivering actionable insights through BI tools built on modern data lakehouse/warehouse architectures. The ideal candidate should have strong experience with data modelling, ETL/ELT pipelines, and performance tuning, and will work with Data Architects, Business Analysts, and Data Scientists to deliver high-quality, reliable, and secure data products that enable advanced analytics.
Technical Skills Required:
Cloud & Data Platforms: Microsoft Azure (Microsoft Fabric, ADF, Synapse, Databricks, ADLS Gen2, Azure SQL, Azure DevOps, Azure Event Hub)
BI Tools: Power BI, Power Automate, Power Apps
Programming Languages: PySpark, Scala, C#, SQL
Databases: SQL Server, Oracle, Snowflake
Version Control: GitHub, Azure DevOps
Other Tools: MS Office Suite, Lucidchart, Draw.io
Please share your resume along with current CTC, expected CTC, and notice period.
Programmers.io
Today
Greetings From TCS!
Azure Data Engineer
5 to 10 years
PAN India
Python/PySpark experience developing in Azure with key data technologies (e.g. ADLS, ADF, Azure Databricks, etc.)
Software development methodologies
Experience working with at least one DevOps tool (GIT, Azure DevOps, Maven, Jenkins)
Prior roles that demonstrate utilisation of high-quality Agile development best practices.
In depth knowledge of at least one scheduling tool (e.g. Control-M, Oozie, ADF, etc.)
Follow me for more Job updates!
Tata Consultancy Services
Today
Responsibilities:
1. Pre-sales Technical Requirement Communication: Participate in requirement research and on-site surveys; produce electrical solutions, equipment lists (BOM), and cost budgets to support bidding and contract signing.
2. Solution Design: Develop integrated control electrical architectures (equipment layer - control layer - scheduling layer - management layer); complete PLC selection and the design of electrical schematics, wiring diagrams, and layout diagrams; and write control logic and HMI interfaces.
3. System Integration: Responsible for the electrical integration of PLCs, frequency converters, servos, sensors, and logistics and warehousing automation equipment (AGV/stacker crane/sorting machine), ensuring communication protocol compatibility and data interoperability.
4. Commissioning and Acceptance: Conduct on-site commissioning of electrical systems, resolve linkage and fault issues, assist customers with acceptance, and provide operational training and technical documentation.
5. Operation and Maintenance Support: Responsible for the maintenance, upgrade, and fault troubleshooting of electrical systems in the later stages of the project to ensure long-term stable operation.
6. Handle promptly and execute accurately other work tasks assigned by departments and superiors.
Qualifications:
1. Bachelor's degree or above, majoring in Electrical Engineering, Automation, Mechatronics, or related fields.
2. At least 2 years of experience in PLC integration and programming for logistics automation projects; familiar with the programming software of mainstream PLC brands (e.g., Siemens); experienced in debugging and fault troubleshooting of logistics equipment.
3. Possess electrical solution design capabilities; able to formulate four-layer architecture schemes according to customer requirements, complete PLC selection, and design electrical schematics (EPLAN), wiring diagrams, and layout diagrams.
4. Excellent problem-solving ability; customer-oriented; proactively optimizes processes.
5. Flexible and efficient team collaboration; excellent time management skills; able to accept urgent tasks and strictly comply with safety regulations.
6. English can be used as a working language; able to adapt to business travel.
Wayzim Technology Co., Ltd.
Today
Organization:
At CommBank, we never lose sight of the role we play in other people's financial wellbeing. Our focus is to help people and businesses move forward to progress. To make the right financial decisions and achieve their dreams, targets, and aspirations. Regardless of where you work within our organisation, your initiative, talent, ideas, and energy all contribute to the impact that we can make with our work. Together we can achieve great things.
Job Title: Senior Platform Engineer - Oracle EPM
Location: Bangalore
Business & Team: Corporate Technology
Impact & contribution:
Join a visionary team driving the evolution of Finance Systems, with a focus on Oracle EPM SaaS and cloud-native platforms. Architect, build, and operate robust automation and CI/CD pipelines, enabling seamless integration and delivery across AWS and Oracle Cloud environments.
Roles & Responsibilities:
Essential Skills:
Educational Qualifications: Bachelor of Engineering or B.Tech
Commonwealth Bank
Today
OpenShift Administrator (L1 & L3) - Hiring Requirement: Belapur, Navi Mumbai.
Simple Logic is hiring for experienced OpenShift Administrators (L1 & L3) to join our team in Belapur, Navi Mumbai (Work From Office).
Position : OpenShift Administrator (L1 & L3)
Experience : 3-12 Years
Location : Belapur - Navi Mumbai
Notice Period : Immediate to 30 Days (Preferred)
Job Summary:
We are seeking skilled OpenShift professionals with a strong Linux background and hands-on experience in managing enterprise-grade OpenShift (OCP v4) environments on VMware-based on-prem infrastructure. The role involves cluster management, automation, troubleshooting, and collaboration with cross-functional teams.
Key Responsibilities:
Additional Responsibilities for L3 / Team Lead:
Required Skills:
Good to Have:
Apply on
Simple Logic IT Private Limited
Today
Job Description
To develop web application software as a member of the product team. Take ownership of assigned project tasks and collaborate with team members on version releases. The job location is Bangalore. Ability to join within 2 weeks will be an advantage.
Experience & Knowledge
3 to 5 years' experience in web application development using the ASP.NET Core API framework, Entity Framework, and PostgreSQL technology stack. Good understanding of the architecture of API-oriented web applications. Experience in UI development using Next.js/React will be a big advantage.
Education
Minimum BE/ B.Tech/ MCA in Computer science or equivalent.
Aurbis Business Parks
Today
Position Responsibilities
Qualifications and Experience Requirement
Qualifications Essential
Skills
RBL Bank
Today
Greetings from TCS
TCS is hiring for GCP Data Engineer
Required Skill Set: GCP, BigQuery, Python, PySpark
Desired Experience Range: 6 to 8 Years
Job Location: Bangalore / Chennai / Hyderabad
Key responsibilities:
As a GCP data engineer, the colleague should be able to design scalable data architectures on Google Cloud Platform, using services like BigQuery and Dataflow. They write and maintain code (Python, Java), ensuring efficient data models and seamless ETL processes. Quality checks and governance are implemented to maintain accurate and reliable data.
Security is a priority, enforcing measures for storage, transmission, and processing, while ensuring compliance with data protection standards. Collaboration with cross-functional teams is key for understanding diverse data requirements. Comprehensive documentation is maintained for data processes, pipelines, and architectures.
Responsibilities extend to optimizing data pipelines and queries for performance, troubleshooting issues, and proactively monitoring data accuracy. Continuous learning is emphasized to stay updated on GCP features and industry best practices, ensuring a current and effective data engineering approach.
Key Skills:
- Proficiency in programming languages: Python, Java
- Expertise in data processing frameworks: Apache Beam (Dataflow)
- Hands-on experience with GCP services: BigQuery, Dataflow
- Knowledge of data modeling and database design
- Experience in ETL (Extract, Transform, Load) processes
- Familiarity with cloud storage solutions
- Strong problem-solving abilities in data engineering challenges
- Understanding of data security and scalability
- Proficiency in relevant tools like Apache Airflow
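The ETL and data-quality responsibilities described above follow a common validate/transform/dead-letter pattern. A minimal, framework-agnostic sketch in plain Python (the field names and schema are illustrative assumptions, not tied to any specific GCP API):

```python
# Sketch of an ETL step with data-quality checks: malformed records are
# routed to a dead-letter list instead of failing the whole batch,
# mirroring the pattern used in Beam/Dataflow pipelines.

def transform(record):
    """Normalize one raw record into the (assumed) target schema."""
    return {
        "user_id": int(record["user_id"]),
        "amount": round(float(record["amount"]), 2),
    }

def run_etl(raw_records):
    """Split raw records into clean rows and a dead-letter list."""
    clean, dead_letter = [], []
    for record in raw_records:
        try:
            clean.append(transform(record))
        except (KeyError, ValueError) as err:
            # Quality check: keep the bad row and the reason for audit.
            dead_letter.append({"record": record, "error": str(err)})
    return clean, dead_letter

clean, dead = run_etl([
    {"user_id": "1", "amount": "19.994"},
    {"user_id": "2"},  # missing amount -> dead letter
])
```

In a real Dataflow job, the same split would typically be expressed with Beam's tagged outputs, with the dead-letter collection written to a separate table for investigation.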
Thank you
Bodhisatwa Ray
Tata Consultancy Services
Today
We are hiring for Data Engineer AZURE
experience - 8+yrs
Location- Pune / Bangalore
Must-Have
Good-to-Have
Tata Consultancy Services
Today
We're Hiring Data Engineer - Databricks
Locations: Bangalore / Chennai / Kochi
Experience: 5 to 8 Years
Are you a Data Engineer passionate about building scalable data pipelines and working with modern cloud data platforms? We're looking for talented professionals to join a high-impact data engineering team working on large-scale, cloud-native data solutions.
Key Responsibilities
Must-Have Skills
Good-to-Have
Tata Consultancy Services
Today
ANSR is hiring for one of its clients.
ABOUT UNDER ARMOUR:
Our story is a classic American entrepreneurial journey, born from a simple, unmet need identified by an athlete. In 1996, founder Kevin Plank, then captain of the University of Maryland football team, set out to solve the problem of sweat-soaked cotton shirts by creating a moisture-wicking synthetic alternative. That innovation - HeatGear - redefined the athletic base layer and laid the foundation for what is today a global leader in performance apparel, footwear, and accessories.
Headquartered in Baltimore, Maryland, Under Armour is driven by a clear mission - to make all athletes better. This relentless pursuit of better defines who we are, shaping our focus on performance, innovation, and continuous improvement.
OUR PURPOSE:
Building on this global foundation, Under Armour India is purpose-built to access India's top-tier talent and embed the brand's culture into scalable, tech-driven solutions. At its core, this is about empowering those who strive for more - a belief reflected in our values: Act Sustainably, Celebrate the Wins, Fight on Together, Love Athletes, and Stand for Equality.
These values serve as a shared framework that guides how we think, build, and collaborate. They connect teams across geographies, reinforce our purpose, and ensure that everything we do is aligned to a common goal - enabling better outcomes for athletes and the business.
At Under Armour India, this translates into an environment where individuals have the freedom to go further, regardless of role. Teams are empowered to develop and deliver state-of-the-art products and digital solutions that enhance performance and drive impact at scale.
VALUES & INNOVATION:
Across Under Armour globally, our values act as the thread that unites every teammate, shaping a culture grounded in purpose, accountability, and shared ambition. They are not just principles, but active drivers of how we operate, innovate, and grow together.
This culture is deeply anchored in innovation - a continuous pursuit of better that pushes boundaries and challenges convention. Whether through product innovation or digital transformation, teams are enabled to create solutions that elevate performance and redefine possibilities for athletes everywhere.
PURPOSE OF ROLE:
The SAP Integration Lead will support all functional areas of our growing, global business which runs SAP's Fashion Management Solution. Candidate will design and craft enterprise-capable applications and develop solutions quickly by possessing a deep understanding of the business process and using all of our toolsets to deliver solutions to the business
This position will support the integration and data flow between our SAP system and multiple internal and external systems and partners. This position will be developing and maintaining these interfaces using SAP BTP Integration Suite and SAP PI/PO.
YOUR IMPACT (Job Responsibilities)
QUALIFICATIONS:
WORKPLACE LOCATION:
OUR COMMITMENT TO EQUAL OPPORTUNITY:
At Under Armour, we are committed to providing an environment of mutual respect where equal employment opportunities are available to all applicants and teammates without regard to race, color, religion or belief, sex, pregnancy (including childbirth, lactation and related medical conditions), national origin, age, physical and mental disability, marital status, sexual orientation, gender identity, gender expression, genetic information (including characteristics and testing), military and veteran status, family or paternal status and any other characteristic protected by applicable law. Under Armour seeks to recruit, develop and retain the most talented people representing a wide variety of backgrounds and perspectives. Reasonable accommodations are available for applicants with disabilities upon request.
ANSR
Today
About Role:
We are supporting a global financial data environment, focusing on Market Data infrastructure.
The role involves on-site support for locations in multiple clients within banking environments.
Requirements
Working model
Who are we?
Amaris Consulting is an independent technology consulting firm providing guidance and solutions to businesses. With more than 1,000 clients across the globe, we have been rolling out solutions in major projects for over a decade. This is made possible by an international team of 7,600 people across 5 continents and more than 60 countries.
Our expertise is built around four Business Lines: Information System & Digital, Telecom, Life Sciences, and Engineering. We focus on building and nurturing a strong talent community where everyone can grow, evolve, and reach their full potential.
At Amaris, we care about the candidate experience. We aim to be transparent, fast, and constructive, giving clear feedback at every step.
What we offer
Recruitment process
Amaris Consulting
Today
Key Deliverables:
Larsen & Toubro
Today
Namaskaram!
We are seeking a skilled and detail-oriented QA Automation Engineer with 6-8 years of experience in automation testing. You will play a crucial role in ensuring the quality and performance of our web and mobile applications through robust test automation strategies, continuous integration, and close collaboration with cross-functional teams.
Key Responsibilities:
Required Skills:
Good to Have:
Before applying, please go through the link below to learn more about us.
Make in India Initiative - AjnaLens featured under the Make in India initiative - Video: "Make in India" -
Company & Product Overview -
Major Product Launches & Milestones - CES Launch - "AjnaLens at CES 2023 Launching Industry's Most Advanced True Mixed Reality Headset" - Video:
AjnaXR Pro Launch - Official Announcement:
Company LinkedIn -
AjnaLens
Today
Talent500
Today
Role: Machine Learning Engineering Lead
Location: Thane, Mumbai
Type: Full-Time opportunity
Exp level: 2+ Years
Required Skills & Experience
Must Have
• 3+ years building ML/NLP evaluation pipelines in Python
• LLM evaluation techniques: LLM-as-judge, semantic similarity, embedding search, red-team dataset curation
• Experience designing abstraction layers for heterogeneous data sources (different APIs, formats, completeness levels)
• Adversarial robustness testing for LLMs (prompt injection, jailbreaking, information extraction)
• AWS services: ECS/Fargate, S3, RDS, SQS, CloudWatch
• Practical implementation of AI ethics, fairness, and bias metrics (not just awareness)
• CI/CD with security-integrated workflows (SAST, dependency scanning)
• Willingness to undergo BPSS-equivalent vetting
Strong Preference
• Experience with Responsible AI / AI governance frameworks (NIST AI RMF, ISO 42001, EU AI Act tooling, UK CDEI)
• Government or regulated-sector compliance experience (data residency, audit trails, clearance constraints)
• LLM observability platforms (Langfuse, LangSmith, or similar)
• Multi-tenant or multi-supplier evaluation platform experience
• Published or presented on AI safety, alignment, or Responsible AI measurement
• Led or mentored a small ML team (2-5 people)
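Among the evaluation techniques listed above, semantic similarity is the simplest to sketch: score a model's answer against a reference by the cosine similarity of their embeddings. In this toy sketch the `embed()` function is a stand-in assumption returning fixed vectors; a real pipeline would call an embedding model:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def embed(text):
    # Placeholder: a real system would call an embedding model here.
    vectors = {
        "the cat sat": [0.9, 0.1, 0.0],
        "a cat is sitting": [0.85, 0.15, 0.05],
        "stock prices fell": [0.0, 0.2, 0.95],
    }
    return vectors[text]

def passes(candidate, reference, threshold=0.9):
    """Evaluation gate: does the candidate stay close to the reference?"""
    return cosine(embed(candidate), embed(reference)) >= threshold

print(passes("a cat is sitting", "the cat sat"))   # True: paraphrase
print(passes("stock prices fell", "the cat sat"))  # False: off-topic
```

LLM-as-judge evaluation replaces the cosine gate with a second model scoring the answer against a rubric; the surrounding harness (thresholds, pass/fail reporting) stays the same shape.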
SkillRecruit
Today
Immediate Joiners Only
Duration: Full-time
Location: Sahakarnagar, Bengaluru
Qualification: Master's degree in Optics, Optical Engineering, Applied Physics, or a related field with 3+ years of relevant opto-mechanical systems development experience
Job Responsibilities
• Optical System Design & Analysis
Lead design, modelling, and performance analysis of optical assemblies for space laser communication payloads.
Conduct tolerance analysis, stray light analysis, and system performance predictions using Zemax, Code V, or equivalent tools.
Optimize optical architectures for manufacturability, reliability, and mission constraints.
• Integration & Qualification
Own alignment and testing of optical systems through prototyping, environmental testing, and final qualification.
Drive troubleshooting and issue resolution during critical development phases under aggressive timelines.
• Component Definition & Procurement
Develop detailed specifications for lenses, coatings, detectors, and related subsystems.
Partner with vendors and suppliers to ensure delivery of high-quality, space-qualified components.
• Cross-functional Collaboration
Work with mechanical, avionics, and systems teams to ensure seamless integration of optics in the final system.
Provide technical leadership in design reviews and mentor junior engineers on best practices.
• Documentation & Delivery
Prepare design reports, qualification documents, and test results with a high degree of clarity and detail.
Ensure deliverables meet technical and programmatic deadlines.
Required Skills/Experience
• Master's degree in Optics, Optical Engineering, Applied Physics, or a related discipline.
• 3+ years of industry experience in optical system design, analysis, and testing (aerospace, satellite, or high-precision optics preferred).
• Hands-on expertise with optical design software (Zemax, Code V or equivalent).
• Strong knowledge of optical alignment, tolerance analysis, and space environmental factors.
• Good understanding of mechanical design principles & geometric dimensioning & tolerancing (GD&T)
• Practical experience with optical metrology, alignment techniques, system integration, and qualification testing.
• Good documentation & report writing skills
• Proven ability to deliver under tight schedules and demanding program milestones.
Desired Skills/Experience
• Experience with space-qualified materials, adhesives, and opto-mechanical systems.
• Familiarity with MATLAB or Python for data analysis and test automation.
• Experience working with optical benches, collimators, beam profilers etc.
• Knowledge of space optical systems, machining and integration processes
• Strong vendor management and cross-functional collaboration skills.
Other Requirements
• Motivated to work within a multidisciplinary team in a fast-paced environment
• Self-driven desire to learn more & continuously upskill themselves
• Ability to deal with challenges, and work out solutions in a team setting
• Critical thinking, ability to critique design and make independent decisions
• Relentless pursuit of excellence in all aspects of work
• Strong interpersonal and organization skills to enable working in a dynamic team
• Strong verbal and written communication skills
Benefits
• Health Insurance coverage
• Flexible working hours
• Opportunity to publish research on behalf of Astrogate Labs in national and international conferences
If you like working in a fast-paced environment with a talented multi-disciplinary team on developing and maturing state-of-the-art LaserComm systems, then this role might be for you.
Astrogate Labs is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
About Us
Astrogate Labs is a space-technology start-up based in Bengaluru, building core technologies in laser communication terminals and networks for small satellites. Astrogate Labs aims to simplify communication, reduce its cost, and enable satellites to send more data to the ground. We have developed one of the industry's smallest optical communication terminals targeted for small-satellite use, and are further scaling our products while aiming for our first satellite mission. We aim to revolutionize satellite communications and support growing satellite downlink needs with a network of optical ground stations and in-space relays using technologies developed in-house.
Why Join Astrogate Labs?
• Work on cutting-edge technology: Build laser communication systems that are 100x faster than traditional RF, are deployed on satellites and defense platforms, and are the only scalable solution for building an inter-planetary internet
• Real impact: Your work will directly contribute to India's space capabilities and national security applications
• Ownership: In a lean & agile team, you'll have significant responsibility and visibility across the entire product lifecycle
• Learning: Work alongside experienced aerospace engineers and advisors
• Growth: Be part of a rapidly scaling startup backed by marquee investors, with clear pathways to leadership roles
Our Culture
We're a startup, which means hustle is part of the DNA. Expect fast iterations and the need to be resourceful. However, we balance this with deep technical rigor - we're building systems that can't afford to fail.
We value:
• Technical excellence with a bias for action
• Collaborative problem-solving across disciplines
• Curiosity and continuous learning
• Pragmatism balanced with innovation
Astrogate Labs
Today
ANSR is hiring for one of its clients.
PURPOSE OF ROLE:
The SAP Basis Lead will support all technical areas of our growing, global business which runs on SAP's Fashion Management Solution. Candidate will be responsible for the availability and guaranteed uptime of all enterprise-capable systems, applications and solutions. The candidate must possess a deep understanding of all SAP's tool sets to deliver stable and trusted SAP solutions to the business.
This position will be the go-to person for all escalations for the SAP system landscape that supports the data flow between our SAP systems and multiple internal and external systems and partners. This position will be responsible for maintaining these interfaces across the SAP ecosystem.
YOUR IMPACT (Job Responsibilities):
QUALIFICATIONS:
WORKPLACE LOCATION:
ANSR
Today
ANSR is hiring for one of its clients.
PURPOSE OF ROLE:
The Site Reliability Engineering (SRE) team at Under Armour drives continuous improvements in performance, resiliency, and operational excellence across our technology platforms. We take a consultative, engineering-first approach to reliability, partnering closely with cross-functional teams to deliver guidance, automation, and best practices that improve the scalability, stability, and reliability of the services that power our products and digital experiences.
We are seeking a Site Reliability Engineer to help strengthen the reliability and scalability of critical systems. In this role, you will build automation, enhance observability, improve operational workflows, and participate in incident response and problem management. The ideal candidate brings a strong foundation in distributed systems, cloud native platforms, and performance optimization, along with a collaborative mindset and a passion for applying SRE principles across the organization.
Innovation is a core part of how we work at Under Armour. Success in this role requires adaptability, continuous learning, and the ability to pivot as technologies, priorities, and business needs evolve.
YOUR IMPACT (Job Responsibilities):
QUALIFICATIONS:
PREFERRED QUALIFICATIONS:
WORKPLACE LOCATION:
ANSR
Today
Launched in 2019 by Landmark Group, Styli Marketplace is the first e-commerce venture of the group, quickly becoming a leading online destination for fashion and lifestyle across the GCC, including Saudi Arabia, the UAE, Kuwait, Bahrain, and beyond. Styli connects global sellers and creators with millions of fashion-forward customers, offering the latest trends, exceptional value, and convenient services like same-day to 48-hour delivery and flexible payment options.
What Are We Looking For?
We are looking for a Site Reliability Engineer who combines deep engineering instincts with an operational mindset.
Qualifications Must-Have
3-6 years of hands-on SRE experience in a high-traffic, consumer-facing product environment.
Strong proficiency with GCP services (GKE, Cloud Run, Pub/Sub, Cloud SQL, Cloud Armor, GCS, IAM).
Deep operational expertise in Kubernetes - from cluster administration to workload tuning and troubleshooting.
Experience with Terraform and Helm for infrastructure and application lifecycle management.
Proficiency in at least one scripting/programming language - Python, Go, or Bash - for automation and tooling.
Hands-on experience with CI/CD platforms and GitOps practices.
Strong understanding of distributed systems, networking fundamentals (DNS, TLS, TCP/IP, HTTP/2), and database reliability patterns
Landmark Group
Today
Job Overview
The purpose of this role is primarily to cover tasks focused on network operations, including managing and supporting daily network tasks such as firewall, SD-WAN, WAF, ADC, and switch maintenance.
The role will require managing network devices, monitoring the network, and supporting L2-L3 incidents and tickets for customers abroad.
The role will also include performing improvements and applying standard best practices to network device configuration.
Key Responsibilities
Minimum requirements of role
Essential:
Communication and Soft Skills
Qualifications
Manufacturers knowledge:
Soffit Infrastructure Services (P) Ltd
Today
Job Title: Automation Engineer - Mechanical Design
Work Location: Bangalore
About the Company: Established in 2005, ZChem Specialties has come a long way to become a full-range NDT company in the areas of Liquid Penetrant Inspection (LPI) and Magnetic Particle Inspection (MPI). Promoted by two experienced technocrats turned entrepreneurs, it is one of the few companies in the world offering a complete range of products in consumables and equipment. With a manufacturing facility in Bangalore, India, ZChem currently operates across India, the Middle East, ASEAN, and African countries.
Who Should Join:
Candidates with a technical background and the hunger to grow, both in their careers and personal lives, thereby making their mark in the industry.
Qualification
Experience
Skills and Competencies:
ZChem
Today
Job Summary :
We are looking for a skilled and detail-oriented SDET - Automation Engineer with 3 - 7 years of experience in QA and automation. The ideal candidate should have strong hands-on expertise in Python-based automation frameworks, UI & API testing, and experience working in Linux-based CI/CD environments. You will work closely with engineering, DevOps, and product teams to ensure high-quality and reliable software releases.
Core Responsibilities :
• Design, develop, and maintain scalable UI and API automation frameworks.
• Write clean, maintainable, and reusable automation scripts using Python.
• Execute automated test suites and analyze results to identify defects and gaps.
• Collaborate with developers and DevOps teams to integrate automated tests into CI/CD pipelines.
• Perform root cause analysis and debugging in Linux-based environments.
• Participate in test planning, test case design, and release validation activities.
• Track and manage defects using standard defect tracking tools.
• Continuously improve automation coverage, performance, and reliability.
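The UI and API framework responsibilities above hinge on keeping test code decoupled from the system under test. As an illustrative sketch (the class and endpoint names are hypothetical, not from this posting), a tiny Python API-test pattern where the HTTP transport is injected so the same test runs against a stub in CI or a real service later:

```python
import json


class ApiClient:
    """Thin API wrapper; `transport` is injected so tests can stub out HTTP."""

    def __init__(self, transport):
        # transport is a callable: (method, path) -> (status_code, body_str)
        self.transport = transport

    def get_json(self, path):
        status, body = self.transport("GET", path)
        assert status == 200, f"unexpected status {status} for {path}"
        return json.loads(body)


def fake_transport(method, path):
    # Stub standing in for a real HTTP call (e.g., one made with `requests`).
    if method == "GET" and path == "/users/1":
        return 200, json.dumps({"id": 1, "name": "alice"})
    return 404, "{}"


def test_get_user():
    client = ApiClient(fake_transport)
    user = client.get_json("/users/1")
    assert user["id"] == 1
    assert user["name"] == "alice"


test_get_user()
```

In a real framework the fake transport would be swapped for an HTTP library and the test function collected by a runner such as pytest; the dependency-injection shape is what makes the suite reliable in Linux-based CI/CD pipelines.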
Must-Have :
• 3 - 7 years of experience in QA / SDET roles.
• At least 2 years of hands-on experience in Python programming and automation frameworks.
• Strong experience in UI and API automation framework development.
• Proficiency in Linux commands and debugging in CI/CD environments.
• Solid understanding of QA methodologies, test planning, and defect tracking processes.
• Good working knowledge of Docker and Kubernetes.
• Strong analytical, problem-solving, and troubleshooting skills.
• Excellent communication skills with the ability to collaborate effectively across teams.
Job Location : Aundh, Pune
Employment Type : Full Time, Work From Office.
Geminus Tech
Today
About the Role
We are seeking a highly experienced Lead Data Engineer to design, build, and automate scalable data platforms. The ideal candidate will have strong hands-on expertise in dbt, Fivetran, Amazon Redshift, Elementary (data observability), job automation tools, and modern development practices using VS Code and Git. Data Architect capability is preferred to help define long-term data platform strategy and standards.
Key Responsibilities
Lead the design and implementation of enterprise-scale data warehouse solutions using Amazon Redshift.
Architect and manage automated data ingestion pipelines using Fivetran.
Develop, test, version, and optimize data transformations using dbt.
Implement and manage data observability using Elementary to ensure data quality, freshness, and reliability.
Design and maintain job automation and scheduling frameworks for data pipelines.
Apply Git-based version control and collaborative development using VS Code.
Define CI/CD standards, conduct code reviews, and enforce best practices.
Contribute to data architecture decisions, including data modeling standards, scalability, and performance optimization.
Mentor team members and collaborate with stakeholders to deliver robust data solutions.
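The data observability responsibility above centers on checking that tables are fresh and reliable. Elementary itself configures this inside dbt, but the underlying idea can be sketched in plain Python (the threshold values and function name here are illustrative assumptions, not a real Elementary configuration):

```python
from datetime import datetime, timedelta, timezone


def freshness_status(last_loaded_at, now=None,
                     warn_after=timedelta(hours=6),
                     error_after=timedelta(hours=24)):
    """Classify a table's freshness from its most recent load timestamp.

    Mirrors the warn/error threshold pattern used by observability tools:
    a table older than warn_after raises a warning, older than error_after
    fails outright. Thresholds here are example values.
    """
    now = now or datetime.now(timezone.utc)
    age = now - last_loaded_at
    if age >= error_after:
        return "error"
    if age >= warn_after:
        return "warn"
    return "pass"


# Worked example with a fixed "now" so the result is deterministic.
now = datetime(2024, 1, 2, 12, 0, tzinfo=timezone.utc)
fresh = freshness_status(now - timedelta(hours=1), now=now)    # "pass"
stale = freshness_status(now - timedelta(hours=7), now=now)    # "warn"
broken = freshness_status(now - timedelta(hours=30), now=now)  # "error"
```

A scheduler (Airflow, dbt Cloud, Step Functions, or similar) would run a check like this per source table and route "warn"/"error" results to alerting.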
Required Skills & Qualifications
10+ years of experience in Data Engineering.
Mandatory hands-on experience with:
Amazon Redshift or similar cloud data warehouses
dbt or similar data transformation frameworks
Fivetran or similar ELT/data ingestion tools
Elementary or similar data observability tools
Automation and job scheduling tools (e.g., Airflow, dbt Cloud, AWS Step Functions, Control-M, or similar)
VS Code and Git, or similar IDEs and version control systems
Strong SQL skills and solid understanding of data warehousing concepts.
Experience working on cloud platforms (preferably AWS).
Strong leadership, communication, and stakeholder management skills.
Preferred Qualifications
Data Architect capability, including defining end-to-end data architecture and governance standards.
Experience with dimensional modeling, data lake/lakehouse architectures.
Python scripting for automation and data engineering tasks.
Exposure to BI and analytics tools (Tableau, Power BI, Looker).
Experience in enterprise-scale data platforms.
Experience in Agile processes and a strong grasp of Software Development Lifecycle (SDLC) concepts.
Soft Skills & Behavioral Requirements
Self-learner with the ability to quickly adapt to new technologies and tools
Quick to grasp complex systems and workflows
Excellent troubleshooting and root-cause analysis skills
Proactive problem-solver with a hands-on and ownership-driven mindset
Ability to work independently with minimal supervision
Azoca Technologies
Today
About the Role
We are seeking a highly experienced Lead Data Engineer to design, build, and automate scalable data platforms. The ideal candidate will have strong hands-on expertise in DBT, Fivetran, Amazon Redshift, Elementary (data observability), job automation tools, and modern development practices using VS Code and Git. Data Architect capability is preferred to help define long-term data platform strategy and standards.
Key Responsibilities
Lead the design and implementation of enterprise-scale data warehouse solutions using Amazon Redshift.
Architect and manage automated data ingestion pipelines using Fivetran.
Develop, test, version, and optimize data transformations using dbt.
Implement and manage data observability using Elementary to ensure data quality, freshness, and reliability.
Design and maintain job automation and scheduling frameworks for data pipelines.
Apply Git-based version control and collaborative development using VS Code.
Define CI/CD standards, conduct code reviews, and enforce best practices.
Contribute to data architecture decisions, including data modeling standards, scalability, and performance optimization.
Mentor team members and collaborate with stakeholders to deliver robust data solutions.
Required Skills & Qualifications
10+ years of experience in Data Engineering.
Mandatory Hands-on Experience with: Amazon Redshift or similar cloud data warehouses
dbt or similar data transformation frameworks.
Fivetran or similar ELT/data ingestion tools
Elementary or similar data observability tools
Automation and job scheduling tools (e.g., Airflow, dbt Cloud, AWS Step Functions, Control-M, or similar)
VS Code and Git or similar IDEs and version control system
Strong SQL skills and solid understanding of data warehousing concepts.
Experience working on cloud platforms (preferably AWS).
Strong leadership, communication, and stakeholder management skills.
Preferred Qualifications
Data Architect capability, including defining end-to-end data architecture and governance standards.
Experience with dimensional modeling, data lake/lakehouse architectures.
Python scripting for automation and data engineering tasks.
Exposure to BI and analytics tools (Tableau, Power BI, Looker).
Experience in enterprise-scale data platforms.
Experience in Agile processes and has a strong grasp of Software Development Lifecycle (SDLC) concepts.
Soft Skills & Behavioral Requirements
Self-learner with the ability to quickly adapt to new technologies and tools
Strong grasping ability with a clear understanding of complex systems and workflows
Excellent troubleshooting and root-cause analysis skills
Proactive problem-solver with a hands-on and ownership-driven mindset
Ability to work independently with minimal supervision
Azoca Technologies
Today
About the Role
We are seeking a highly experienced Lead Data Engineer to design, build, and automate scalable data platforms. The ideal candidate will have strong hands-on expertise in DBT, Fivetran, Amazon Redshift, Elementary (data observability), job automation tools, and modern development practices using VS Code and Git. Data Architect capability is preferred to help define long-term data platform strategy and standards.
Key Responsibilities
Lead the design and implementation of enterprise-scale data warehouse solutions using Amazon Redshift.
Architect and manage automated data ingestion pipelines using Fivetran.
Develop, test, version, and optimize data transformations using dbt.
Implement and manage data observability using Elementary to ensure data quality, freshness, and reliability.
Design and maintain job automation and scheduling frameworks for data pipelines.
Apply Git-based version control and collaborative development using VS Code.
Define CI/CD standards, conduct code reviews, and enforce best practices.
Contribute to data architecture decisions, including data modeling standards, scalability, and performance optimization.
Mentor team members and collaborate with stakeholders to deliver robust data solutions.
Required Skills & Qualifications
10+ years of experience in Data Engineering.
Mandatory Hands-on Experience with: Amazon Redshift or similar cloud data warehouses
dbt or similar data transformation frameworks.
Fivetran or similar ELT/data ingestion tools
Elementary or similar data observability tools
Automation and job scheduling tools (e.g., Airflow, dbt Cloud, AWS Step Functions, Control-M, or similar)
VS Code and Git or similar IDEs and version control system
Strong SQL skills and solid understanding of data warehousing concepts.
Experience working on cloud platforms (preferably AWS).
Strong leadership, communication, and stakeholder management skills.
Preferred Qualifications
Data Architect capability, including defining end-to-end data architecture and governance standards.
Experience with dimensional modeling, data lake/lakehouse architectures.
Python scripting for automation and data engineering tasks.
Exposure to BI and analytics tools (Tableau, Power BI, Looker).
Experience in enterprise-scale data platforms.
Experience in Agile processes and has a strong grasp of Software Development Lifecycle (SDLC) concepts.
Soft Skills & Behavioral Requirements
Self-learner with the ability to quickly adapt to new technologies and tools
Strong grasping ability with a clear understanding of complex systems and workflows
Excellent troubleshooting and root-cause analysis skills
Proactive problem-solver with a hands-on and ownership-driven mindset
Ability to work independently with minimal supervision
Azoca Technologies
Today
About the Role
We are seeking a highly experienced Lead Data Engineer to design, build, and automate scalable data platforms. The ideal candidate will have strong hands-on expertise in DBT, Fivetran, Amazon Redshift, Elementary (data observability), job automation tools, and modern development practices using VS Code and Git. Data Architect capability is preferred to help define long-term data platform strategy and standards.
Key Responsibilities
Lead the design and implementation of enterprise-scale data warehouse solutions using Amazon Redshift.
Architect and manage automated data ingestion pipelines using Fivetran.
Develop, test, version, and optimize data transformations using dbt.
Implement and manage data observability using Elementary to ensure data quality, freshness, and reliability.
Design and maintain job automation and scheduling frameworks for data pipelines.
Apply Git-based version control and collaborative development using VS Code.
Define CI/CD standards, conduct code reviews, and enforce best practices.
Contribute to data architecture decisions, including data modeling standards, scalability, and performance optimization.
Mentor team members and collaborate with stakeholders to deliver robust data solutions.
Required Skills & Qualifications
10+ years of experience in Data Engineering.
Mandatory Hands-on Experience with: Amazon Redshift or similar cloud data warehouses
dbt or similar data transformation frameworks.
Fivetran or similar ELT/data ingestion tools
Elementary or similar data observability tools
Automation and job scheduling tools (e.g., Airflow, dbt Cloud, AWS Step Functions, Control-M, or similar)
VS Code and Git or similar IDEs and version control system
Strong SQL skills and solid understanding of data warehousing concepts.
Experience working on cloud platforms (preferably AWS).
Strong leadership, communication, and stakeholder management skills.
Preferred Qualifications
Data Architect capability, including defining end-to-end data architecture and governance standards.
Experience with dimensional modeling, data lake/lakehouse architectures.
Python scripting for automation and data engineering tasks.
Exposure to BI and analytics tools (Tableau, Power BI, Looker).
Experience in enterprise-scale data platforms.
Experience in Agile processes and has a strong grasp of Software Development Lifecycle (SDLC) concepts.
Soft Skills & Behavioral Requirements
Self-learner with the ability to quickly adapt to new technologies and tools
Strong grasping ability with a clear understanding of complex systems and workflows
Excellent troubleshooting and root-cause analysis skills
Proactive problem-solver with a hands-on and ownership-driven mindset
Ability to work independently with minimal supervision
Azoca Technologies
Today
About the Role
We are seeking a highly experienced Lead Data Engineer to design, build, and automate scalable data platforms. The ideal candidate will have strong hands-on expertise in DBT, Fivetran, Amazon Redshift, Elementary (data observability), job automation tools, and modern development practices using VS Code and Git. Data Architect capability is preferred to help define long-term data platform strategy and standards.
Key Responsibilities
Lead the design and implementation of enterprise-scale data warehouse solutions using Amazon Redshift.
Architect and manage automated data ingestion pipelines using Fivetran.
Develop, test, version, and optimize data transformations using dbt.
Implement and manage data observability using Elementary to ensure data quality, freshness, and reliability.
Design and maintain job automation and scheduling frameworks for data pipelines.
Apply Git-based version control and collaborative development using VS Code.
Define CI/CD standards, conduct code reviews, and enforce best practices.
Contribute to data architecture decisions, including data modeling standards, scalability, and performance optimization.
Mentor team members and collaborate with stakeholders to deliver robust data solutions.
Required Skills & Qualifications
10+ years of experience in Data Engineering.
Mandatory Hands-on Experience with: Amazon Redshift or similar cloud data warehouses
dbt or similar data transformation frameworks.
Fivetran or similar ELT/data ingestion tools
Elementary or similar data observability tools
Automation and job scheduling tools (e.g., Airflow, dbt Cloud, AWS Step Functions, Control-M, or similar)
VS Code and Git or similar IDEs and version control system
Strong SQL skills and solid understanding of data warehousing concepts.
Experience working on cloud platforms (preferably AWS).
Strong leadership, communication, and stakeholder management skills.
Preferred Qualifications
Data Architect capability, including defining end-to-end data architecture and governance standards.
Experience with dimensional modeling, data lake/lakehouse architectures.
Python scripting for automation and data engineering tasks.
Exposure to BI and analytics tools (Tableau, Power BI, Looker).
Experience in enterprise-scale data platforms.
Experience in Agile processes and has a strong grasp of Software Development Lifecycle (SDLC) concepts.
Soft Skills & Behavioral Requirements
Self-learner with the ability to quickly adapt to new technologies and tools
Strong grasping ability with a clear understanding of complex systems and workflows
Excellent troubleshooting and root-cause analysis skills
Proactive problem-solver with a hands-on and ownership-driven mindset
Ability to work independently with minimal supervision
Azoca Technologies
Today
About the Role
We are seeking a highly experienced Lead Data Engineer to design, build, and automate scalable data platforms. The ideal candidate will have strong hands-on expertise in DBT, Fivetran, Amazon Redshift, Elementary (data observability), job automation tools, and modern development practices using VS Code and Git. Data Architect capability is preferred to help define long-term data platform strategy and standards.
Key Responsibilities
Lead the design and implementation of enterprise-scale data warehouse solutions using Amazon Redshift.
Architect and manage automated data ingestion pipelines using Fivetran.
Develop, test, version, and optimize data transformations using dbt.
Implement and manage data observability using Elementary to ensure data quality, freshness, and reliability.
Design and maintain job automation and scheduling frameworks for data pipelines.
Apply Git-based version control and collaborative development using VS Code.
Define CI/CD standards, conduct code reviews, and enforce best practices.
Contribute to data architecture decisions, including data modeling standards, scalability, and performance optimization.
Mentor team members and collaborate with stakeholders to deliver robust data solutions.
Required Skills & Qualifications
10+ years of experience in Data Engineering.
Mandatory Hands-on Experience with: Amazon Redshift or similar cloud data warehouses
dbt or similar data transformation frameworks.
Fivetran or similar ELT/data ingestion tools
Elementary or similar data observability tools
Automation and job scheduling tools (e.g., Airflow, dbt Cloud, AWS Step Functions, Control-M, or similar)
VS Code and Git or similar IDEs and version control system
Strong SQL skills and solid understanding of data warehousing concepts.
Experience working on cloud platforms (preferably AWS).
Strong leadership, communication, and stakeholder management skills.
Preferred Qualifications
Data Architect capability, including defining end-to-end data architecture and governance standards.
Experience with dimensional modeling, data lake/lakehouse architectures.
Python scripting for automation and data engineering tasks.
Exposure to BI and analytics tools (Tableau, Power BI, Looker).
Experience in enterprise-scale data platforms.
Experience in Agile processes and has a strong grasp of Software Development Lifecycle (SDLC) concepts.
Soft Skills & Behavioral Requirements
Self-learner with the ability to quickly adapt to new technologies and tools
Strong grasping ability with a clear understanding of complex systems and workflows
Excellent troubleshooting and root-cause analysis skills
Proactive problem-solver with a hands-on and ownership-driven mindset
Ability to work independently with minimal supervision
Azoca Technologies
Today
About the Role
We are seeking a highly experienced Lead Data Engineer to design, build, and automate scalable data platforms. The ideal candidate will have strong hands-on expertise in DBT, Fivetran, Amazon Redshift, Elementary (data observability), job automation tools, and modern development practices using VS Code and Git. Data Architect capability is preferred to help define long-term data platform strategy and standards.
Key Responsibilities
Lead the design and implementation of enterprise-scale data warehouse solutions using Amazon Redshift.
Architect and manage automated data ingestion pipelines using Fivetran.
Develop, test, version, and optimize data transformations using dbt.
Implement and manage data observability using Elementary to ensure data quality, freshness, and reliability.
Design and maintain job automation and scheduling frameworks for data pipelines.
Apply Git-based version control and collaborative development using VS Code.
Define CI/CD standards, conduct code reviews, and enforce best practices.
Contribute to data architecture decisions, including data modeling standards, scalability, and performance optimization.
Mentor team members and collaborate with stakeholders to deliver robust data solutions.
Required Skills & Qualifications
10+ years of experience in Data Engineering.
Mandatory Hands-on Experience with: Amazon Redshift or similar cloud data warehouses
dbt or similar data transformation frameworks.
Fivetran or similar ELT/data ingestion tools
Elementary or similar data observability tools
Automation and job scheduling tools (e.g., Airflow, dbt Cloud, AWS Step Functions, Control-M, or similar)
VS Code and Git or similar IDEs and version control system
Strong SQL skills and solid understanding of data warehousing concepts.
Experience working on cloud platforms (preferably AWS).
Strong leadership, communication, and stakeholder management skills.
Preferred Qualifications
Data Architect capability, including defining end-to-end data architecture and governance standards.
Experience with dimensional modeling, data lake/lakehouse architectures.
Python scripting for automation and data engineering tasks.
Exposure to BI and analytics tools (Tableau, Power BI, Looker).
Experience in enterprise-scale data platforms.
Experience in Agile processes and has a strong grasp of Software Development Lifecycle (SDLC) concepts.
Soft Skills & Behavioral Requirements
Self-learner with the ability to quickly adapt to new technologies and tools
Strong grasping ability with a clear understanding of complex systems and workflows
Excellent troubleshooting and root-cause analysis skills
Proactive problem-solver with a hands-on and ownership-driven mindset
Ability to work independently with minimal supervision
Azoca Technologies
Today
About the Role
We are seeking a highly experienced Lead Data Engineer to design, build, and automate scalable data platforms. The ideal candidate will have strong hands-on expertise in DBT, Fivetran, Amazon Redshift, Elementary (data observability), job automation tools, and modern development practices using VS Code and Git. Data Architect capability is preferred to help define long-term data platform strategy and standards.
Key Responsibilities
Lead the design and implementation of enterprise-scale data warehouse solutions using Amazon Redshift.
Architect and manage automated data ingestion pipelines using Fivetran.
Develop, test, version, and optimize data transformations using dbt.
Implement and manage data observability using Elementary to ensure data quality, freshness, and reliability.
Design and maintain job automation and scheduling frameworks for data pipelines.
Apply Git-based version control and collaborative development using VS Code.
Define CI/CD standards, conduct code reviews, and enforce best practices.
Contribute to data architecture decisions, including data modeling standards, scalability, and performance optimization.
Mentor team members and collaborate with stakeholders to deliver robust data solutions.
Required Skills & Qualifications
10+ years of experience in Data Engineering.
Mandatory Hands-on Experience with: Amazon Redshift or similar cloud data warehouses
dbt or similar data transformation frameworks.
Fivetran or similar ELT/data ingestion tools
Elementary or similar data observability tools
Automation and job scheduling tools (e.g., Airflow, dbt Cloud, AWS Step Functions, Control-M, or similar)
VS Code and Git or similar IDEs and version control system
Strong SQL skills and solid understanding of data warehousing concepts.
Experience working on cloud platforms (preferably AWS).
Strong leadership, communication, and stakeholder management skills.
Preferred Qualifications
Data Architect capability, including defining end-to-end data architecture and governance standards.
Experience with dimensional modeling, data lake/lakehouse architectures.
Python scripting for automation and data engineering tasks.
Exposure to BI and analytics tools (Tableau, Power BI, Looker).
Experience in enterprise-scale data platforms.
Experience in Agile processes and has a strong grasp of Software Development Lifecycle (SDLC) concepts.
Soft Skills & Behavioral Requirements
Self-learner with the ability to quickly adapt to new technologies and tools
Strong grasping ability with a clear understanding of complex systems and workflows
Excellent troubleshooting and root-cause analysis skills
Proactive problem-solver with a hands-on and ownership-driven mindset
Ability to work independently with minimal supervision
Azoca Technologies
Today
About the Role
We are seeking a highly experienced Lead Data Engineer to design, build, and automate scalable data platforms. The ideal candidate will have strong hands-on expertise in DBT, Fivetran, Amazon Redshift, Elementary (data observability), job automation tools, and modern development practices using VS Code and Git. Data Architect capability is preferred to help define long-term data platform strategy and standards.
Key Responsibilities
Lead the design and implementation of enterprise-scale data warehouse solutions using Amazon Redshift.
Architect and manage automated data ingestion pipelines using Fivetran.
Develop, test, version, and optimize data transformations using dbt.
Implement and manage data observability using Elementary to ensure data quality, freshness, and reliability.
Design and maintain job automation and scheduling frameworks for data pipelines.
Apply Git-based version control and collaborative development using VS Code.
Define CI/CD standards, conduct code reviews, and enforce best practices.
Contribute to data architecture decisions, including data modeling standards, scalability, and performance optimization.
Mentor team members and collaborate with stakeholders to deliver robust data solutions.
Required Skills & Qualifications
10+ years of experience in Data Engineering.
Mandatory Hands-on Experience with: Amazon Redshift or similar cloud data warehouses
dbt or similar data transformation frameworks.
Fivetran or similar ELT/data ingestion tools
Elementary or similar data observability tools
Automation and job scheduling tools (e.g., Airflow, dbt Cloud, AWS Step Functions, Control-M, or similar)
VS Code and Git or similar IDEs and version control system
Strong SQL skills and solid understanding of data warehousing concepts.
Experience working on cloud platforms (preferably AWS).
Strong leadership, communication, and stakeholder management skills.
Preferred Qualifications
Data Architect capability, including defining end-to-end data architecture and governance standards.
Experience with dimensional modeling, data lake/lakehouse architectures.
Python scripting for automation and data engineering tasks.
Exposure to BI and analytics tools (Tableau, Power BI, Looker).
Experience in enterprise-scale data platforms.
Experience in Agile processes and has a strong grasp of Software Development Lifecycle (SDLC) concepts.
Soft Skills & Behavioral Requirements
Self-learner with the ability to quickly adapt to new technologies and tools
Strong grasping ability with a clear understanding of complex systems and workflows
Excellent troubleshooting and root-cause analysis skills
Proactive problem-solver with a hands-on and ownership-driven mindset
Ability to work independently with minimal supervision
Azoca Technologies
Today
About the Role
We are seeking a highly experienced Lead Data Engineer to design, build, and automate scalable data platforms. The ideal candidate will have strong hands-on expertise in DBT, Fivetran, Amazon Redshift, Elementary (data observability), job automation tools, and modern development practices using VS Code and Git. Data Architect capability is preferred to help define long-term data platform strategy and standards.
Key Responsibilities
Lead the design and implementation of enterprise-scale data warehouse solutions using Amazon Redshift.
Architect and manage automated data ingestion pipelines using Fivetran.
Develop, test, version, and optimize data transformations using dbt.
Implement and manage data observability using Elementary to ensure data quality, freshness, and reliability.
Design and maintain job automation and scheduling frameworks for data pipelines.
Apply Git-based version control and collaborative development using VS Code.
Define CI/CD standards, conduct code reviews, and enforce best practices.
Contribute to data architecture decisions, including data modeling standards, scalability, and performance optimization.
Mentor team members and collaborate with stakeholders to deliver robust data solutions.
Required Skills & Qualifications
10+ years of experience in Data Engineering.
Mandatory Hands-on Experience with: Amazon Redshift or similar cloud data warehouses
dbt or similar data transformation frameworks.
Fivetran or similar ELT/data ingestion tools
Elementary or similar data observability tools
Automation and job scheduling tools (e.g., Airflow, dbt Cloud, AWS Step Functions, Control-M, or similar)
VS Code and Git or similar IDEs and version control system
Strong SQL skills and solid understanding of data warehousing concepts.
Experience working on cloud platforms (preferably AWS).
Strong leadership, communication, and stakeholder management skills.
Preferred Qualifications
Data Architect capability, including defining end-to-end data architecture and governance standards.
Experience with dimensional modeling, data lake/lakehouse architectures.
Python scripting for automation and data engineering tasks.
Exposure to BI and analytics tools (Tableau, Power BI, Looker).
Experience in enterprise-scale data platforms.
Experience in Agile processes and has a strong grasp of Software Development Lifecycle (SDLC) concepts.
Soft Skills & Behavioral Requirements
Self-learner with the ability to quickly adapt to new technologies and tools
Strong grasping ability with a clear understanding of complex systems and workflows
Excellent troubleshooting and root-cause analysis skills
Proactive problem-solver with a hands-on and ownership-driven mindset
Ability to work independently with minimal supervision
Azoca Technologies
Today
Azure Data Engineer (Contract) - INDIA - Fully Remote - 6 Months
We're hiring an experienced Azure Data Engineer to support a growing analytics platform and help scale modern data pipelines in a cloud-first environment. This is a hands-on role focused on building reliable data foundations that enable reporting, modelling and business insights.
What you'll do
What we're looking for
Details
This role suits someone who enjoys building clean, scalable pipelines and enabling the business to trust and use their data.
InterEx Group
Today
L&T-Sargent & Lundy Limited (CLS)
5 days ago
Job Title: Technical Support Engineer - Application Management
Experience Level: 3-7 Years
Key Responsibilities:
Application Deployment & Management:
Technical Support & Troubleshooting:
Security & Compliance:
HDFC securities
3 days ago
Role: Senior Infosec Engineer
Location: Bavdhan, Pune, Maharashtra, India (On-site)
Industry: Fintech / Financial Services
Role Summary
We are looking for a dynamic and hands-on Senior Infosec Engineer who will be responsible for leading and implementing our information security and cyber-risk strategy. The ideal candidate will have strong experience in fintech or financial services, knowledge of regulatory frameworks (like RBI, SEBI), and the ability to build secure digital systems while enabling growth and innovation.
Key Responsibilities
InCred Money
2 days ago
ANSR is hiring for one of its clients.
About American Airlines:
To Care for People on Life's Journey®. Together with our American Eagle regional partners, we offer thousands of flights daily to more than 350 destinations in more than 60 countries.
American Airlines is transforming the way it delivers technology to its customers and team members worldwide.
American’s Tech Hub in Hyderabad, India, is our latest technology office location and home to team members who drive technical innovation and engineer unrivalled digital products to best serve American’s customers and team members. With U.S. tech hubs in Dallas-Fort Worth, Texas and Phoenix, Arizona, our new team in Hyderabad, India enables better support of our 24/7 operation and positions American to deliver industry-leading technology solutions that create a world-class customer experience.
INTRO:
Are you ready to explore a world of possibilities, both at work and during your time off? Join our American Airlines family, and you’ll travel the world, grow your expertise, and become the best version of you. As you embark on a new journey, you’ll tackle challenges with flexibility and grace, learning new skills and advancing your career while having the time of your life. Feel free to enrich both your personal and work life and hop onboard!
WHY YOU’LL LOVE THIS JOB:
You’ll join a diverse, high performing engineeri
ANSR
3 days ago
Viraliq is a dynamic digital marketing agency dedicated to helping businesses thrive in the digital sphere. Specializing in social media management, performance marketing, AI-driven video creation, and lead generation, Viraliq empowers startups, local businesses, and brands to strengthen their online presence. With a focus on innovation, creativity, and data-driven strategies, the agency delivers impactful marketing campaigns that drive measurable growth. Based on combining technology with strategic marketing, Viraliq provides services like Google & Meta ads, local business marketing, and content creation to enhance brand visibility and engagement.
Role Description
This is a full-time, on-site Technical Support Engineer role based in Chandigarh. As a Technical Support Engineer, you will be responsible for diagnosing and resolving technical issues reported by clients, providing effective troubleshooting, and delivering exceptional customer support. The role involves collaborating with team members to ensure customer satisfaction, documenting reported issues, and escalating complex problems to higher-level teams when necessary. This position emphasizes proactive problem-solving to maintain system functionality and optimize client experience.
Qualifications
Viraliq
3 days ago
Experience: 2-3 Years
Location: Bangalore/Hyderabad
Education: B.E/B.Tech in ECE/EEE or M.E/M.Tech in VLSI/Electronics
Roles and Responsibilities
Share the profiles to raksha.k@acldigital.com.
ACL Digital
3 days ago
Position
Group Manager
About JiBe
JiBe is a cloud based fully integrated ERP system for the shipping industry. Our goal is to allow shipping companies to improve productivity, efficiency and safety levels, while reducing costs. JiBe ERP enables increased automation and streamlining of processes, creating pre-defined workflows and reducing the usage of email and paper.
JiBe is looking for a top-notch engineer to join our team. You will have an opportunity to directly impact our growing team.
Job Responsibilities
Qualifications
JiBe ERP
Yesterday
As a Site Reliability Engineer, you aim to solve operational problems through software and have experience in the following areas:
UST
3 days ago
Dear,
We are hiring for the role of STA Engineer with strong experience in Static Timing Analysis and timing closure. Please find the job details below:
Job Responsibilities:
Required Skills:
Experience:- 3-10 yrs.
Location:- Bangalore
How to Apply:
If interested, please share your updated resume to monikayadav@mirafra.com with the subject line “Application for STA Engineer” .
Mirafra Technologies
2 days ago
Role Overview and Responsibilities:
We are seeking an experienced and highly motivated Mechanisms and R&D Lead Engineer to join our Space Systems Engineering team. In this role, you will lead the design, analysis, and development of advanced mechanical systems and mechanisms for spacecraft and deployers. You will take full ownership of mechanism projects from concept design and prototyping through qualification and flight readiness ensuring technical excellence, reliability, and mission success.
You will also be responsible for guiding a team of engineers, driving design innovation, and ensuring adherence to aerospace standards and best practices across all mechanism development activities.
Key Responsibilities Include, but Are Not Limited To:
Dhruva Space
2 days ago
AI Development Engineer
Experience: 3 to 5 Years
Location: Hyderabad
Notice Period: 30 Days
Job Description:
We are seeking a Robotics Development Engineer to build a hardware-accelerated, AI-enabled robotics software stack for autonomous mobile robots (AMRs) and intelligent robotic systems. In this role, you will design and implement robotics software that leverages heterogeneous compute (CPU + GPU + NPU) to deliver high-performance perception, localization, planning, and control. You will work across ROS 2, AI pipelines, simulation, and hardware acceleration, helping bring next-generation autonomous systems from concept to production.
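The "planning" piece of an AMR stack can be illustrated with a toy, stdlib-only sketch: breadth-first search for a shortest path on a small occupancy grid. This is purely illustrative; a production planner would use costmaps and an algorithm such as A* inside ROS 2, and the grid below is invented.

```python
from collections import deque

# Toy planner: BFS shortest path on an occupancy grid (1 = obstacle).
def plan(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    queue, parent = deque([start]), {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Reconstruct the path by walking parents back to the start.
            path = []
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in parent:
                parent[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # goal unreachable

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(plan(grid, (0, 0), (2, 0)))  # detours around the obstacle row
```

BFS guarantees a shortest path on an unweighted grid; real planners trade this simplicity for weighted costs and continuous kinematics.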
Preferred Experience:
ACL Digital
3 days ago
Company Description
G-Origin Product Solutions Pvt Ltd is the strategic bridge between India’s high-precision manufacturing sector and the global marketplace. We empower small to medium-sized OEMs and global distributors by delivering high-quality, India-made products with a relentless adherence to our core hierarchy: Safety >> Quality > Delivery > Cost.
We don't just facilitate transactions; we manage the entire lifecycle. From NDAs and quoting to rigorous supplier audits, PPAP, and real-time production tracking, we ensure that "cost advantage" never comes at the expense of "operational excellence." Whether it's custom components or off-the-shelf goods delivered directly to a distributor's door, G-Origin handles the complexity so our partners can focus on growth.
Beyond the balance sheet, we are committed to the future. We actively partner with manufacturers who prioritize sustainability, specifically those integrating solar or wind energy into their production. At G-Origin, we aren't just shipping products; we’re engineering a more reliable, sustainable global supply chain.
Role Description
This full-time, on-site role based in Pune (Kothrud) is for a Sourcing Engineer (Strategic Sourcing). The Sourcing Engineer will manage supplier evaluation, development, and quality assurance processes. The role involves strategizing procurement efforts, ana
G-Origin Product Solutions Pvt Ltd
4 days ago
Scaledge Technology is Hiring: Design Verification Engineer with 5+Years of experience in IP or Subsystem verification.
Location: Bangalore
Share updated resume on sneha.khandelwal@scaledge.io
Requirements:
4–6 years of experience in ASIC/SoC Design Verification.
Experience verifying any of the protocols such as SPI, SMBus/I2C, NAND Flash, JTAG, and AXI.
Strong knowledge of System Verilog and UVM methodology.
Hands-on experience with VIP integration and testbench development. Strong debugging and problem-solving skills.
Ability to work effectively in a collaborative team environment.
Self-driven with good communication and interpersonal skills.
Scaledge Technology
2 days ago
Job Summary
We are seeking a technically strong and enthusiastic Mechanical Engineer – Digital Practice to enhance our BIM capabilities across mechanical disciplines. The primary responsibility of this role is to conduct BIM trainings, develop Revit MEP content, and provide hands-on technical support to project teams for BIM-related challenges.
This role is ideal for mechanical engineers passionate about digital transformation, BIM standardization, and automation within engineering workflows.
Key Roles & Responsibilities
1. BIM Training & Knowledge Development
2. Revit MEP Content Development
3. Project Support & Coordination
Dar
4 days ago
Senior Design Verification Engineer
Job Description:
Experience: 3 to 8 Years.
Location: Bangalore.
ACL Digital
3 days ago
The ideal candidate will be responsible for executing and leading the full design process from ideation to production. You will draw new designs and update our current designs. Throughout the creation process, you will ensure that our designs meet all necessary design requirements.
Responsibilities
Qualifications
TOOGANN TECHNOLOGIES
2 days ago
We are looking for a highly skilled Backend Engineer with 8+ years of experience in developing backend applications and services using Node.js and Nest.js (TypeScript). The ideal candidate should be excellent at code reviews, have a strong sense of software design strategy, and thrive in a fast-paced, collaborative environment. A deep understanding of database systems, microservices architecture at enterprise scale, and B2B system design will be highly valued.
Responsibilities:
● Design, develop, and maintain scalable, high-performance backend applications and services using Node.js and Nest.js.
● Write and maintain unit test cases to ensure code coverage and quality assurance.
● Collaborate in peer programming sessions to share knowledge and improve code quality.
● Conduct code reviews to maintain high engineering standards across the team.
● Design and maintain database structures using PostgreSQL and MongoDB.
● Optimize PostgreSQL queries and work with ORMs in TypeScript (e.g., TypeORM, Prisma).
● Work with Kafka MQ for message-driven communication and system integrations.
● Build and maintain containerized applications using Docker and Kubernetes.
● Collaborate with cross-functional teams to ensure seamless integration between frontend and backend systems.
● Contribute to architecture design, system scalability, and performance optimization.
RingCentral
2 days ago
We are looking for a hands-on Security Analyst who takes complete ownership of our security posture — across every device, every server, every application, and every user in our environment.
This is not a monitoring-only role. You will actively assess, identify weaknesses, and prescribe and implement the specific steps needed to fix them. You will be the person who detects threats before they become incidents — including threats that come from inside the organization. You will own endpoint security for our Apple device fleet, harden and audit our cloud and on-premise servers, evaluate our SaaS and internal applications for vulnerabilities, and build the processes that keep us secure as we scale.
If you have done this work hands-on — not supervised it, not theorized about it, but actually built and secured real systems — this role is for you.
We are not looking for someone who generates reports and waits for engineers to action them.
What You’ll Be Responsible For
Confidential Startup SaaS Company
3 days ago
Hiring Verification Engineer
Location- Noida, Ahmedabad, Bangalore, Pune, Hyderabad, Chennai
Please ignore this mail if you are already in an interview process with us or working with us.
Job description:
Experience- 5+ years
· Must have expertise in ASIC verification methodologies and ASIC design flow
· Experience working with SV and UVM methodology; knowledge of at least one industry-standard protocol such as Ethernet, PCIe, MIPI, USB, AXI, RISC-V, AMBA, DDR or similar is required; must have executed at least 2 SoC verification projects
· Experience in any of the listed topics: UVM, formal verification, mixed-signal simulations, power-aware simulations
· Experience in setting up and debugging functional and/or gate-level simulations
· Experience in translating functional requirements into verification plans
· Experience in developing verification environment and regression setup.
· Coverage analysis and closure
<<✍Share resume at medha.gaur@einfochips.com>>
eInfochips (An Arrow Company)
5 days ago
Ascendion is looking for an experienced Python Data Engineer with hands-on experience building data platforms and delivering big-data platform projects.
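As a rough illustration of the pipeline work such a role involves, here is a minimal, library-free sketch of a deduplicate-then-aggregate ETL step. The event shape is invented for illustration, and in practice this would run as a PySpark or Spark SQL job rather than plain Python.

```python
from collections import defaultdict

# Raw events as they might land from an ingestion layer; the schema is made up.
raw_events = [
    {"id": 1, "user": "a", "amount": 10},
    {"id": 1, "user": "a", "amount": 10},  # duplicate delivery
    {"id": 2, "user": "b", "amount": 5},
    {"id": 3, "user": "a", "amount": 7},
]

# Transform: keep only the first occurrence of each event id.
seen, deduped = set(), []
for e in raw_events:
    if e["id"] not in seen:
        seen.add(e["id"])
        deduped.append(e)

# Load-side aggregate: total amount per user.
totals = defaultdict(int)
for e in deduped:
    totals[e["user"]] += e["amount"]
print(dict(totals))  # {'a': 17, 'b': 5}
```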
Required Skills & Experience:
Must have/non-negotiable skills:
About Us:
Ascendion is a leading provider of AI-first software engineering services. Our applied AI, software engineering, cloud, data, experience design, and talent transformation capabilities accelerate innovation for Global 2000 clients. Ascendion is headquartered in New Jersey. In addition to our remote/hybrid workforce, we have 30+ offices across the U.S., UK, Poland, Romania, India, Australia and Mexico. We are committed to building technology powered by Generative AI with an inclusive workforce, service to our communities, and a vibrant culture. For more information, please go to
Ascendion
5 days ago
Job Description:
We are seeking an experienced AI Engineer with a strong background in Natural Language Understanding (NLU) who is passionate about pushing the boundaries of Conversational AI. In this role, you will design, develop, and deploy scalable AI solutions leveraging LLMs, Retrieval-Augmented Generation (RAG), and prompt engineering techniques to power intelligent products and services.
As part of our ML/AI team, you’ll own the full lifecycle of model development — from data preparation and fine-tuning to inference optimization and deployment in production environments.
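The retrieval step of a RAG pipeline can be sketched in a few lines. This is a toy, stdlib-only illustration, not the team's actual stack: the documents, the scoring, and the prompt template are all invented, and a production system would use embedding-based vector search rather than keyword overlap.

```python
# Toy RAG retrieval: score documents by keyword overlap with the question,
# then assemble an augmented prompt for the LLM.
def tokenize(text: str) -> set[str]:
    return set(text.lower().split())

def retrieve(question: str, docs: list[str], k: int = 2) -> list[str]:
    q = tokenize(question)
    # Highest keyword overlap first; ties keep original order (stable sort).
    return sorted(docs, key=lambda d: len(q & tokenize(d)), reverse=True)[:k]

def build_prompt(question: str, docs: list[str]) -> str:
    context = "\n".join(f"- {d}" for d in retrieve(question, docs))
    return f"Answer using only this context:\n{context}\nQuestion: {question}"

docs = [
    "Plan upgrades take effect at the next billing cycle.",
    "Password resets are emailed within five minutes.",
    "Refunds are processed in seven business days.",
]
print(build_prompt("When do plan upgrades take effect?", docs))
```

The same shape survives the upgrade to real components: swap `retrieve` for a vector-store similarity search and feed `build_prompt`'s output to the model.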
Responsibilities:
RingCentral
5 days ago
Evolving Systems Networks India Limited.
Job Title: Production Support Engineer (MS – Airtel)
Job Location: Remote, Delhi based.
Required Experience: 1–3 years
Who we are:
Evolving Systems provides software solutions and services to telecommunications operators (communication service providers, or CSPs) worldwide. The company's portfolio includes service activation, SIM and eSIM lifecycle management, number and asset management, customer value and loyalty management, and digital engagement solutions. Evolving Systems works with over 100 telecom operators worldwide, with deployments across five continents, supporting services that reach more than one billion consumers globally.
Website :
Key Responsibilities:
• Provide Level 1 / Level 2 production support for applications deployed at customer sites
• Monitor production systems and proactively identify, investigate, and resolve incidents
• Perform root cause analysis and assist in permanent fix implementation for recurring issues
• Execute shell scripts and SQL queries for troubleshooting, data validation, and operational tasks
• Support Oracle database-related activities, including basic performance checks
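The "SQL queries for troubleshooting and data validation" duty can be illustrated with a minimal, stdlib-only sketch. The table, columns, and sample data are invented, and an in-memory SQLite database stands in for the production Oracle instance.

```python
import sqlite3

# Stand-in for a production activations table (schema invented for illustration).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE activations (msisdn TEXT, status TEXT, sim_id TEXT)")
conn.executemany(
    "INSERT INTO activations VALUES (?, ?, ?)",
    [("9100000001", "ACTIVE", "S1"),
     ("9100000002", "PENDING", None),   # stuck record: pending with no SIM
     ("9100000003", "ACTIVE", "S3")],
)

# Validation query: activations stuck in PENDING without an assigned SIM.
stuck = conn.execute(
    "SELECT msisdn FROM activations WHERE status = 'PENDING' AND sim_id IS NULL"
).fetchall()
print("stuck activations:", [row[0] for row in stuck])
```

The same query shape, run against the real schema, is the sort of check a support engineer would script and schedule to catch stuck records before customers report them.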
Evolving Systems
2 days ago
PHYSICAL DESIGN ENGINEER/LEAD - NOIDA/Ahmedabad/Bangalore/Hyderabad/Pune
(NO WORK FROM HOME OR REMOTE WORK)
Job description: 4 to 15 years of hands-on experience
NO WORK FROM HOME.
eInfochips (An Arrow Company)
4 days ago
Required skills/experience :
What you will do:
Cashfree Payments
4 days ago
About the Company
Cadence Bangalore is Hiring for Senior Principal Signal Integrity engineer.
About the Role
This is a unique opportunity to join the HPP IP R&D Group at Cadence Design Systems. We are looking for a Sr Principal SI/PI Engineer who will be a key contributor to our advanced high speed IP products. This is a hands-on technical position.
Responsibilities
Qualifications
Cadence
3 days ago
We are looking for a Physical Design Engineer to work on end‑to‑end ASIC/SoC block implementation.
Key Responsibilities:
Experience: 4+ years
Location: Hyderabad/Bangalore/Noida/Ahmedabad/Pune/Chennai (On-site only)
Role: Physical Design Engineer
eInfochips (An Arrow Company)
3 days ago
Frontend Engineer (React)
Location: Bengaluru (In-Office)
Experience: 3–5 Years
Type: Full-time
At Persistence, we’re building AI voice agents for businesses, turning product knowledge into fast, reliable, always-on customer experiences. We’re a fast-moving startup focused on building modern, scalable products, and we’re looking for a Frontend Engineer who cares deeply about performance, clean code, and great UX.
You’ll work closely with backend engineers and product teams to build fast, reliable, and beautiful user interfaces that power real products.
What you’ll do
What we’re looking for
Persistence
2 days ago
We are seeking a highly driven Senior Design Verification Engineer for RTL functional verification of cutting-edge in‑vehicle networking devices within NXP’s next‑generation automotive product line. In this individual‑contributor role, you will architect, enhance, and maintain advanced UVM‑based and C‑based verification environments. You will define robust verification strategies, craft comprehensive test plans, and drive metric‑driven verification to full closure. You will collaborate closely with world‑class teams across Design, Architecture, Validation, and Firmware, ensuring seamless integration and adherence to rigorous automotive design and quality processes. This is an opportunity to shape the verification strategy of high‑impact automotive solutions that define reliability, safety, and performance for vehicles worldwide.
Main Job Tasks and Responsibilities:
NXP Semiconductors
3 days ago
About CIBC India:
CIBC India is a technology and operations hub in Hyderabad, where you’ll be part of our highly engaged and global team, and play a central role in supporting our growth. Whether you’re driving innovation in digital banking or streamlining client onboarding, you’ll be part of a culture recognized for excellence and investment in people. At CIBC India, people and progress are at the center of what we do: you’ll develop skills, collaborate with industry leaders, and see your ideas realized in a culture that values progress and belonging. We provide cutting-edge resources, ongoing learning, and a supportive environment where people come first. If you’re ready to create meaningful change and build your future, CIBC India is where your ambition meets opportunity.
What You’ll Be Doing (Position Summary):
The Quality Engineer is responsible for developing and executing manual and automated test scenarios and scripts for system integration and acceptance testing of banking platforms, in compliance with established controls and standards. The role collaborates with business analysts and development teams to ensure quality deliverables, analyzes test results, and provides actionable insights to stakeholders. The Quality Engineer also contributes to process improvement initiatives and maintains quality metrics, supporting the continuous improvement of testing practices within the organization.
At CIBC India we enable the
CIBC India
2 days ago
What you will be doing
We are looking for a Data Scientist who can operate at the intersection of classical machine learning, large-scale recommendation systems, and modern agentic AI systems.
You will design, build, and deploy intelligent systems that power Glance’s personalized lock screen and live entertainment experiences. This role blends deep ML craftsmanship with forward-looking innovation in autonomous/agentic systems.
Your responsibilities will include:
Classical ML & Recommendation Systems
Agentic Systems & Next-Gen AI
Glance
3 days ago
Goodscore is on a mission to make financial literacy and credit access simple, actionable, and inclusive for over 150 million Indians who are credit-underserved. From starting personalized credit consulting in 2023 to now empowering over 5 million users to take control of their credit health, we are one of India’s fastest-growing credit score & credit management apps. We are backed by leading VC funds in India, consistently ranked among the top 10 finance apps in the country, and are currently on a 1x to 100x growth journey. We’re building a high-impact team that shares our vision of transforming how India understands and manages credit and this is only the beginning.
Few resources and highlights that may be helpful for a quick introduction to Goodscore:
About the Role:
We’re looking for Backend Engineers who love building fast, scalable and intuitive user interfaces. You’ll work closely with product, design and frontend teams to translate ideas into high-quality user experiences that directly impact business outcomes. This role is ideal for someone who enjoys owning features end-to-end, writing clean code and shipping fast in a high-growth environment.
Key Responsibilities:
GoodScore
3 days ago
We are looking for a dynamic Project Engineer who can drive projects from concept to completion with precision, efficiency, and a problem-solving mindset
Kindly Note - This position will be on Third Party Payroll and Location will be at Mahape, Navi Mumbai
Job Responsibilities: -
Project Management
TotalEnergies
3 days ago
Company Description
Atharva Infotech is a forward-thinking software company specializing in open, enterprise-wide business solutions for the travel industry, built on .NET technology. Our solutions are flexible, modular, and seamlessly integrate with existing systems, offering users full internet-enabled, web browser access. Continuously evolving, our products address the unique requirements of modern tour operators, whether they offer pre-packaged, specialist, or dynamically assembled travel options. With a strong focus on customer-centric design, Atharva Infotech delivers cutting-edge E-Travel technology to optimize the marketing and sale of holiday products.
Job Summary: We are looking for a Development Support Engineer to assist the development team in application support, issue analysis, bug fixing, and coordination between QA, product, clients and development teams. The role involves troubleshooting production issues, supporting deployments, and ensuring smooth day-to-day technical operations.
Key Responsibilities:
Atharva Infotech Pvt. Ltd.
Yesterday
Job Title: Mechanical Design Engineer
Language Requirement: Japanese Proficiency - JLPT N3 or N2 & above
Job Description
About Creative Synergies Group:
Creative Synergies Group
Yesterday
About CIBC India:
CIBC India is a technology and operations hub in Hyderabad, where you’ll be part of our highly engaged and global team, and play a central role in supporting our growth. Whether you’re driving innovation in digital banking or streamlining client onboarding, you’ll be part of a culture recognized for excellence and investment in people. At CIBC India, people and progress are at the center of what we do: you’ll develop skills, collaborate with industry leaders, and see your ideas realized in a culture that values progress and belonging. We provide cutting-edge resources, ongoing learning, and a supportive environment where people come first. If you’re ready to create meaningful change and build your future, CIBC India is where your ambition meets opportunity.
What You’ll Be Doing (position summary):
The Intermediate Quality Engineer leads testing activities within CIBC’s Core Banking and Payments Technology Quality Engineering team. This role is responsible for developing and maintaining automated test scripts, creating comprehensive test plans and schedules, and ensuring all testing processes meet quality standards and project timelines. The incumbent works closely with project managers, developers, quality engineers, and business stakeholders to deliver high-quality solutions and continuous improvement in testing methodologies.
At CIBC India we enable the work environment most optimal for you to thrive in you
CIBC India
5 days ago
Job Summary:
We are seeking a highly skilled Electrical Design Engineer with experience in the USA (AEC industry). The ideal candidate should have strong technical expertise in SKM or ETAP and possess excellent communication skills.
Key Responsibilities:
Requirements:
Preferred Q
Uppteam
4 days ago
Python-focused AI Backend Engineer with strong experience building production-grade FastAPI services and AI agent workflows
Must Have:
Backend Engineering
Retrieval-Augmented Generation (RAG)
AI Agents & Frameworks
Data & Storage
Ways of Working
Nice to Hav
Insight Global
5 days ago
The role is responsible for managing plant-quality concerns, driving supplier-related issue resolution, supporting NPI builds, and ensuring uninterrupted production by executing effective containment, corrective actions, and preventive mechanisms.
Key Responsibilities:
Husco
3 days ago
Mandatory skills: Network Engineer.
Notice – Immediate or max 1 week – if more than this, don’t share
Shift - US support hours in any shift as per below
Crown Solutions
Yesterday
Job Title: Structural Design Engineer – Steel structures
Experience: 4+ years | Location: Bangalore | Type: Full-Time
Role Overview
We are looking for an experienced Structural Design Engineer to expand our in-house capability in steel building design. The role involves structural analysis, design optimization, and detailing coordination of steel structures, as well as supporting RC design. The ideal candidate should have strong fundamentals, sound engineering judgement, and experience in delivering efficient, safe steel designs that are also easy to construct.
Key Responsibilities
Total Environment Building Systems Pvt Ltd
5 days ago
The ideal candidate is a critical thinker who will proactively improve the quality systems of the company through internal audits, participation in process improvement programs, and interaction with key partners to obtain constructive feedback.
Responsibilities
Qualifications
GMS ELEGANT BUILDERS (I) PVT LTD
4 days ago
About SmartWinnr
SmartWinnr is an enterprise-grade sales enablement platform that helps organizations build sales readiness through AI sales roleplay, coaching, and certification at scale. The platform serves Fortune 500 clients including Boston Scientific, Johnson & Johnson, and AstraZeneca, along with other major companies such as Abbott, Medtronic, HDFC Life, SBI Life, ICICI Prudential, and Aditya Birla to boost sales productivity. SmartWinnr has earned recognition from G2 as one of the top 50 sales products for two consecutive years, based on authentic customer reviews and feedback.
Job Role: Pre-Sales Solutions Engineer
Job Location: Hyderabad (on-site)
Role Description
As a Pre-Sales Solutions Engineer, you will play a critical role in supporting the sales function by delivering impactful product demonstrations and assisting in solution positioning for prospective customers across the APAC region. You will work closely with the sales, product, and customer-facing teams to ensure a seamless and effective pre-sales experience.
Responsibilities include but are not limited to:
• Conduct product demonstrations and presentations tailored to customer requirements and business use cases
• Collaborate closely with the sales team to understand prospect needs and propose appropriate product solutions
• Assist in cr
SmartWinnr
Yesterday
Job Title: Electrical Design Engineer
Location: Gujarat (Preferred)
Experience: 8-10 Years
Job Description:
We are seeking a highly skilled and detail-oriented Electrical Design Engineer with 9-10 years of experience in the design field, preferably in the industrial segment. The ideal candidate should have a strong technical background in electrical system design, calculations, and layout preparation.
Key Responsibilities:
VMS
4 days ago
About PANSOFT Services
PANSOFT Services is an engineering solutions company specializing in Product Design, CAE (FEA/CFD), and Engineering Automation.
We partner with global clients through a dedicated engineering team model, where our engineers work directly on real-time product development projects, ensuring high-quality and scalable engineering delivery.
Role Overview
We are looking for a Mechanical Design Engineer with hands-on experience in drafting, sheet metal design, and container development.
This role involves working as part of a client-deployed team, contributing to live engineering projects, including battery pack container-level design and mechanical enclosures.
Key Responsibilities
PANSOFT Services (PTS)
3 days ago
The Memory Circuit Design Verification Engineer will contribute to the verification of advanced DRAM and HBM memory products.
You will work closely with a dynamic team to ensure timely delivery of functionally correct designs, collaborating with global teams and leveraging state-of-the-art verification methodologies.
Key Responsibilities
Perform FastSpice-based verification (Primesim, FineSim, SpectreFx) for DRAM critical timings and full-chip analog simulation across PVTs and speed grades.
Develop, maintain, and execute System Verilog/UVM-based testbenches and verification environments for full-chip validation.
Analyze and debug pre-silicon full-chip designs, focusing on timing closure and functional coverage.
Collaborate and communicate effectively with cross-geo teams (e.g., US) to align verification flows and share technical insights.
Document verification results, report issues, and contribute to continuous improvement of verification methodologies.
Must-Have Skills
Hands-on experience with Fast Spice simulators for analog and mixed-signal verification.
Proficiency in System Verilog/UVM-based verification methodology.
Strong understanding of DRAM memory architecture and timing analysis.
Secondary Skills
Scripting skills in Python or Tcl for automation and regression analysis.
Familiarity with
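The "Python for automation and regression analysis" skill can be sketched with a small log-summarizing script. The log format below is invented for illustration; real regressions emit tool-specific logs.

```python
import re
from collections import Counter

# Invented simulator log excerpt; real UVM logs are tool-specific.
log = """\
UVM_INFO @ 1000: test_dram_refresh ... PASSED
UVM_ERROR @ 2300: test_hbm_timing ... FAILED
UVM_INFO @ 4100: test_selfrefresh ... PASSED
"""

# Count pass/fail results and collect the names of failing tests.
status = Counter()
failures = []
for line in log.splitlines():
    m = re.search(r"(\w+) \.\.\. (PASSED|FAILED)", line)
    if m:
        status[m.group(2)] += 1
        if m.group(2) == "FAILED":
            failures.append(m.group(1))

print(dict(status), "failing:", failures)
```

Pointed at a directory of regression logs instead of a string, the same loop becomes a nightly triage report.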
Mirafra Technologies
2 days ago
We are looking for talented professionals with strong expertise in Generative AI, Agentic AI systems, and Machine Learning to design and build intelligent, scalable, and production-grade AI solutions. This includes working on LLM-based applications, multi-agent systems, and end-to-end ML pipelines.
Key Responsibilities:
1. Machine Learning & Statistical Modelling
2. GenAI / LLM Application Development
3. Agentic Systems & MCP
ALTEN India
3 days ago
A senior technical engineering role responsible for designing and building scalable, secure, and high‑performance data platforms. This role brings deep expertise in data engineering, modern data architectures, and AI‑enabled data solutions. You will partner closely with architecture and cross‑functional teams to translate business needs into robust, future‑ready data systems.
Key Responsibilities & Skills:
H&R Block
3 days ago
About RedCompass Labs:
We are global experts in payments modernization, using the latest applied AI and deep payments expertise to deliver end-to-end payment transformation projects. Our solutions help clients accelerate their payments modernization programs while reducing costs and regulatory risk.
The RedFlag Accelerator sets the industry standard for detecting human trafficking related crime, assisting both financial institutions and law enforcement. Using a persona-oriented approach, it identifies the actors behind transactions, including victims, money-laundering professionals, perpetrators, and others. By combining AI, the latest red flags, crime models, and OSINT data, our system helps detect crimes across labor trafficking, sex trafficking, and child sexual exploitation.
With offices in London, Warsaw, Brussels, Miami, Toronto, and Pune, we support clients worldwide. To learn more about RedCompass Labs and our services, visit .
Role Overview
Total Exp.: 5+ years
Job Location: Baner, Pune
We are looking for a talented and motivated Manual QA Engineer with deep payments domain expertise to join our growing Pune team. This role is ideal for a payments professional who has built strong QA experience across global payment schemes and is looking to grow into a Business Analyst role over time.
You will work on complex, high-impact payments transformation programmes for tier
RedCompass Labs
Yesterday
We are hiring a Low-Power SoC DV Engineer with strong expertise in UPF-based verification. This role focuses on full-chip/sub-system verification, power intent validation, and advanced debugging.
Join a team where your low-power verification expertise directly influences silicon quality and design robustness.
Best,
Karthik Kumar
karthik.adasu@Proxelera.com
Proxelera
3 days ago
About Turing:
Based in San Francisco, California, Turing is the world’s leading research accelerator for frontier AI labs and a trusted partner for global enterprises deploying advanced AI systems. Turing supports customers in two ways: first, by accelerating frontier research with high-quality data, advanced training pipelines, and top AI researchers who specialize in coding, reasoning, STEM, multilinguality, multimodality, and agents; and second, by applying that expertise to help enterprises transform AI from proof of concept into proprietary intelligence with systems that perform reliably, deliver measurable impact, and drive lasting results on the P&L.
Job Overview:
An Electrical Engineering Pod Lead responsible for managing a small team creating simulation-based design problems used to train and evaluate advanced AI models. The role owns problem quality, reproducible environments, and coordination with AI researchers, and requires strong EE domain depth plus hands-on Python, Docker, and basic cloud skills to review and support the pod end-to-end.
Key Responsibilities:
Turing
Yesterday