Summary: As an experienced Big Data Engineer, help us build the next generation of data technology at Macquarie within our Banking & Financial Services business.
Description:
As a Big Data Engineer, you will be part of an exciting technology team working on a cutting-edge platform. We need an experienced Data Engineer to help us build the next generation of data technology within our Banking & Financial Services business group. You will join a cross-functional Agile team with a diverse range of skills, which you'll be able to leverage to develop your own T-shaped skill set and grow your career.
You'll bring your in-depth knowledge of big data technology best practices and a desire to work in a DevOps environment where you have end-to-end accountability for developing, deploying and supporting your data assets.
You will have some of the following skills and experience, with a preference for Scala/Java and distributed platform capabilities:
- knowledge of data warehousing/ETL concepts, or experience working on similar projects
- strong Linux/Unix skills
- strong SQL
- experience with AWS or other cloud environments
- demonstrated experience using distributed computing frameworks such as Hadoop and Spark, and distributed SQL and/or NoSQL query engines
- significant experience in Scala, Java, and/or Python
- experience with big data querying tools such as Hive or Presto.
Ideally you will also bring:
- deep understanding of AWS cloud architecture
- exposure to Data Vault 2.0
- experience with Talend (preferred, but not required)
- experience working with DataStax Cassandra implementations
- knowledge of security concepts and best practices
- experience with data pipeline tools such as Apache Oozie, Luigi or Airflow
- familiarity with build tools such as Maven or SBT.