Ecolab is actively recruiting a Data Analyst to work with the Analytics & Digital team within the company. Read on to learn more about this opening.
About Company
Ecolab has an opening for a Logistics Data Engineer in our Procurement Analytics and Digital Solutions group located in Pune, IN. If you are a passionate professional who is seeking opportunity, advancement, and a rewarding career, we invite you to apply. This is a fantastic opportunity to join a highly recognized global growth company offering competitive compensation and benefits in addition to career growth.
The individual will work with the Digital Capabilities team and will be responsible for designing, developing, and maintaining data pipelines and infrastructure to support our logistics procurement analytics operations. This role involves working closely with cross-functional teams to ensure data accuracy, reliability, and availability for analysis and reporting purposes. The ideal candidate should possess strong technical skills and a passion for working with data. You see things a little differently. So do we. Our Analytics and Digital Solutions (ADS) team is responsible for driving innovation via automation, strategic business insights, and process optimization.
Ecolab Is Hiring 648+ Entry-Level (0-2 Years Exp.) Data Analysts In Pune
| Job ID | 54162 |
| --- | --- |
| Job Role | Data Analyst |
| Education Qualifications | BE/B.Tech or other technical degree |
| Salary Package | 4 to 6 LPA, or as per company standards |
| Experience | Fresher |
| Job Location | Pune, Maharashtra |
| Last Apply Date | ASAP |
Attention: Read all criteria carefully and tailor your resume with relevant keywords to improve your chances of an interview call; otherwise you may lose the opportunity.
Job Overview
We are seeking a talented and experienced Logistics Data Engineer to join our team. The Logistics Data Engineer will be responsible for designing, developing, and maintaining data pipelines and infrastructure to support our logistics analytics operations. This role will involve working closely with cross-functional teams to ensure data accuracy, reliability, and availability for analysis and reporting purposes.
Qualifications
- Bachelor’s degree in Computer Science, Information Systems, or Engineering. An advanced degree is a plus.
- Strong programming skills in languages such as Python and SQL. Experience with data integration tools and frameworks is desirable.
- Exposure to cloud service platforms such as Azure, AWS, or Fivetran.
- Exposure to Snowflake.
- Exposure to SAP tables and the SAP logistics module.
- Proficiency in data modeling and database design principles.
- Familiarity with ETL processes, data warehousing concepts, and business intelligence tools.
- Understanding of logistics operations, supply chain management, and related data domains.
- Excellent analytical and problem-solving skills with attention to detail.
- Strong communication and collaboration skills to work effectively with cross-functional teams.
- Ability to adapt to a fast-paced environment and handle multiple projects simultaneously.
Key Responsibilities
- Develop and maintain new data pipelines using Azure Data Factory: Design, build, and maintain scalable and efficient data pipelines to extract, transform, and load (ETL) data from various sources into our data warehouse. This includes handling large volumes of structured and unstructured data related to logistics operations.
- Data modeling and architecture: Collaborate with stakeholders to understand logistics data requirements and design efficient data models and architecture that support analytics and reporting needs. Ensure data integrity, accuracy, and consistency throughout the data lifecycle.
- Data quality and validation: Implement data validation processes and quality checks to identify and resolve data anomalies, inconsistencies, and errors. Work closely with business users to understand data issues and provide timely resolutions.
- Data integration and API development: Integrate data from external systems, vendors, and partners through APIs or other data integration methods. Develop and maintain data interfaces to enable seamless data exchange between systems.
- Performance optimization: Identify and implement strategies to optimize data processing and query performance to ensure timely data availability for analytics and reporting purposes. Monitor and fine-tune the performance of data pipelines to maintain efficiency and reliability.
- Documentation and knowledge sharing: Document data engineering processes, data models, and technical specifications. Share knowledge and best practices with the team to foster a collaborative and data-driven culture.
- Collaboration and cross-functional support: Collaborate with cross-functional teams, including logistics operations, business analysts, data scientists, and IT, to understand their data needs and provide technical support and guidance. Participate in project discussions and contribute to data-driven decision-making processes. Lead the development of new data products that enable new logistics analytics capabilities.
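To give a flavor of the data-quality work described above, here is a minimal Python sketch of a validation step that might sit in such a pipeline. This is purely illustrative and assumes a hypothetical shipment-record schema (`shipment_id`, `weight_kg`, etc.); it is not Ecolab's actual pipeline or tooling.

```python
# Illustrative data-quality check for hypothetical logistics records,
# of the kind run before loading data into a warehouse.
from datetime import date

# Assumed (hypothetical) required fields for a shipment record.
REQUIRED_FIELDS = {"shipment_id", "origin", "destination", "ship_date", "weight_kg"}

def validate_record(record: dict) -> list:
    """Return a list of data-quality issues found in one shipment record."""
    issues = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        issues.append(f"missing fields: {sorted(missing)}")
    weight = record.get("weight_kg")
    if isinstance(weight, (int, float)) and weight <= 0:
        issues.append("weight_kg must be positive")
    ship_date = record.get("ship_date")
    if isinstance(ship_date, date) and ship_date > date.today():
        issues.append("ship_date is in the future")
    return issues

records = [
    {"shipment_id": "S1", "origin": "Pune", "destination": "Mumbai",
     "ship_date": date(2024, 1, 5), "weight_kg": 120.0},
    {"shipment_id": "S2", "origin": "Pune", "weight_kg": -3.0},  # fails checks
]

# Split into clean rows (loaded downstream) and rejected rows (sent for review).
clean = [r for r in records if not validate_record(r)]
rejected = [r for r in records if validate_record(r)]
```

In a production setting, checks like these would typically run inside an orchestrated pipeline (for example, as an Azure Data Factory activity calling a notebook or function), with rejected rows routed to a review queue rather than silently dropped.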