WHAT TO EXPECT
We are looking for a Senior Big Data Engineer with proven experience in designing, building and managing complex data solutions for high-volume data arriving at high velocity from a variety of sources.
Your responsibility will be to design and implement streaming data pipelines for Big Data systems. You will design and refine data models for complex databases, both relational (SQL) and NoSQL, and translate business problems into data analytics requirements and solutions that deliver value by analysing and processing data, building and maintaining models and report templates, and developing dynamic, data-driven solutions.
This role provides a rare opportunity to support and grow an entirely new data capability within Jaguar Land Rover's Remote Connected Vehicle Diagnostics team. You will form an essential part of a collaborative, iterative and agile process within a team of designers, software, DevOps and quality engineers, helping deliver best-in-class digital solutions to the business and its customers.
Key Accountabilities and Responsibilities
* Monitoring and providing insight into the changing data storage and usage requirements for Remote Connected Vehicle Diagnostics to maintain the integrity and security of the data
* Working collaboratively with our software development teams and Enterprise Architects to define sophisticated tools and techniques for building, monitoring and maintaining data
* Supporting developers to utilise data pipelines and databases, sharing best practices
WHAT YOU'LL NEED
You will need strong experience in streaming technologies such as Kafka, Apache Beam and Apache Spark, and strong exposure to working with large data sets and Big Data technologies. You must be able to design, create, test and maintain data pipelines to support analytics projects, and to identify, design and implement process improvements in those pipelines. A strong understanding of non-relational data modelling and of key distributed database concepts such as partitioning, sharding and clustering is also essential.
Knowledge, Skills and Experience
* Previous experience and knowledge of modern programming languages and scripts
* Strong experience in Python and Java development
* Experience with cloud data environments such as GCP & AWS
* Working knowledge of data warehousing technologies such as BigQuery & Redshift
* Experience of working in an Agile/XP/Scrum team and familiarity with associated techniques
* Familiarity with orchestration frameworks such as Apache Airflow, Luigi & Apache NiFi