Key Responsibilities:
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and translate them into scalable data solutions.
- Design and develop robust data pipelines, ETL processes, and data integration frameworks to collect, transform, and store large volumes of structured and unstructured data from various sources.
- Implement and maintain data warehouses, data lakes, and data marts to support reporting, analytics, and business intelligence initiatives.
- Identify, assess, and incorporate new tools, technologies, and best practices to drive continuous improvement in data engineering processes.
- Ensure the integrity, security, and privacy of data by implementing appropriate data governance standards and procedures.
- Perform data profiling, cleansing, validation, and enrichment activities to optimize the quality and usability of data.
- Collaborate with infrastructure teams to ensure optimal performance, scalability, and availability of the data infrastructure.
- Monitor data pipelines, identify performance bottlenecks, and troubleshoot issues to ensure smooth and uninterrupted data flow.
- Document data engineering processes, data models, and architectural decisions to facilitate knowledge sharing and enable effective collaboration.
- Stay updated on emerging trends, technologies, and industry practices in the data engineering domain, and provide recommendations for improvements and enhancements.

Qualifications and Skills:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 3+ years of experience as a Data Engineer or in a similar role, preferably within the logistics industry.
- Strong proficiency in SQL and experience working with relational databases (e.g., MySQL, PostgreSQL) and big data technologies (e.g., Hadoop, Spark).
- Proficiency in Python and familiarity with data processing frameworks (e.g., Apache Beam, Apache Flink).
- Experience with data modeling, data integration, and data warehousing concepts and technologies (e.g., Snowflake, Redshift).
- Solid understanding of ETL processes, workflow orchestration tools (e.g., Airflow, Luigi), and version control systems (e.g., Git).
- Familiarity with cloud platforms (e.g., AWS, GCP, Azure) and working knowledge of containerization technologies (e.g., Docker, Kubernetes).
- Strong analytical, problem-solving, and critical-thinking skills with keen attention to detail.
- Excellent communication and collaboration skills, with the ability to explain complex technical concepts to non-technical stakeholders.

If this outstanding opportunity sounds like your next career move, please send your resume in Word format to Wiki Wong at
[email protected] and put "Data Engineer - Global Logistics Company" in the subject header. Data provided is for recruitment purposes only.

_________________________________________________________

Headquartered in Hong Kong, Pinpoint Asia is the go-to specialist firm for technology recruitment. We are a team of specialist tech recruiters (many of whom come from an IT background), and we serve a wide range of clients, from tech startups (especially FinTech) to some of the top financial institutions on Wall Street, as well as several other large-scale enterprises in other industries.

Our significant market reputation and status as the leading search firm for many of our clients is a direct result of our strong industry relationships, intimate understanding of the marketplace, and proven ability to deliver results. Our vision is to help companies hire smarter and help job seekers get closer to their career aspirations.

To see all our open jobs, please visit https://pinpointasia.com/job-search/ (EA License #72371)

We are also seeking top-calibre candidates for the following exciting roles:
1) Full Stack .NET Senior Developer - Digital Insurance Platform
2) Mobile Tech Business Analyst - Private Wealth Management
3) Java Solidity Blockchain Software Engineer - Leading FinTech Group