Autoliv's global BI, Data Analytics, and Data Science teams are looking for a Senior Data Engineer to help create new value for Autoliv. You will manage large datasets and complex data environments to obtain actionable insights from our corporate data. This role focuses on designing, implementing, and enhancing data architectures for data ingestion and processing, including making large data sets available for BI and advanced analytics, all built on the Microsoft Azure platform.
This is a perfect opportunity to become part of a growing team that develops analysis tools for all areas of our global business, and it suits a person who enjoys working in a collaborative and agile environment. The position requires a results-driven individual with a passion for data and analytics who can work collaboratively with others to solve key business problems and drive business growth and data insights.
Job Description (Responsibilities & Duties)
- Develop data architectures and data models for data ingestion and processing.
- Engineer data flows, data ingestion tools, and techniques to support big data.
- Work with both structured and unstructured data to build data models and data schemas.
- "Wrangle" large data sets to obtain actionable insights from them.
- Work with global BI, ETL, and visualization experts and tool sets to process and manage data.
- Ensure quality and validity of data through data architecture and related best practices.
- Support the growth and scalability of data pipelines, data lakes, and data warehousing.
Experience & Education
- Bachelor's degree or higher in Computer Science, Data Science, Data Engineering, or equivalent.
- 5+ years of industry experience as a data architect, engineer, statistician, or in a similar position, working with ETL, ELT, CDC, data processing, database programming, and data analytics tools and processes.
- Spoken and written English skills required.
- Data Vault 1.0 certification preferred (but not required).
- Azure data structures and technologies (SQL, Data Lake services, data warehousing services, analytics services, AI/ML Workbench, Databricks, etc.).
- Snowflake enterprise data warehouse (Snowflake database, Snowpipe, SnowSQL).
- Experience with automation and DevOps tools such as Azure DevOps, GitHub, UiPath, Ansible, etc.
- Ability to architect various data structures and apply common data transformation methods (3NF, Data Vault, data marts, etc.).
- Ability to architect large, complex data sets (Spark, Hive, HBase, Hadoop, Oracle, Teradata, HDFS, SAS, etc.).
- Experience with statistical scripting/programming languages (R, Python), data formats such as JSON, and REST-based API development.
- Excellent written and verbal communication and presentation skills (English) required.
- Ability to communicate with BI, ETL, Data Analytics, and other IT team members.
- Ability to present technical findings in a clear, easy-to-understand manner to a non-technical audience.
- Enjoy mathematical modeling, statistical mathematics, predictive modeling, and pattern recognition.
- Bring strong enthusiasm and a team-oriented, entrepreneurial way of thinking.
- Familiar with agile software development practices and able to apply agile working methods to projects.
- Project management experience is a plus, but not required.