Software Engineer - ETL

About you:
  • You're comfortable with Python (we're looking for over a year of professional experience).
  • You're confident with Linux and the command line.
  • You're keen to work with large-scale production systems that span the globe (2 data centres consuming over 9,000 GB of RAM, peak Kafka throughput of 500k to 1 million messages per second).
  • You're looking to apply software engineering best practices in development and operations.
  • You never say no to learning.
What you'll do: 
  • Develop software - stream processors, a large collection of internal ETL tasks, projects assisting the dev and analytics teams, event logging modules, and more.
  • Learn infrastructure management - writing Salt states, updating our Terraform provisioning automation, working with Google Cloud Platform, running services on our Mesos cluster.
  • Instrument the infrastructure to provide the team with situational awareness.
  • Help ensure we meet our internal SLOs for services.
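To give a flavour of the ETL work above, here is a hypothetical, minimal sketch of the kind of transform a stream processor or internal ETL task might apply to an incoming event (the event shape and field names are invented for illustration, not taken from our codebase):

```python
import json
from datetime import datetime, timezone

def transform_event(raw: bytes) -> dict:
    """Hypothetical ETL step: parse a raw JSON event, normalise the
    epoch timestamp to UTC ISO-8601, and keep only the fields loaded
    downstream. (Event shape and field names are illustrative only.)"""
    event = json.loads(raw)
    ts = datetime.fromtimestamp(event["ts"], tz=timezone.utc)
    return {
        "user_id": event["user_id"],
        "action": event["action"],
        "timestamp": ts.isoformat(),
    }

# Example: a raw event as it might arrive from a Kafka topic
raw = b'{"ts": 1700000000, "user_id": 42, "action": "click", "debug": true}'
print(transform_event(raw))
```

In production such a function would sit inside a consumer loop reading from Kafka rather than operating on a literal byte string, but the extract-transform-load shape is the same.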
We work with:
  • Programming languages: Python 3 (and PyPy), Clojure, Ruby, Go
  • ETL: Kafka, Celery, Elasticsearch, Logstash, Kibana
  • Databases: MySQL, Postgres, Cassandra, BigQuery, Redis
  • Containerisation: Docker, Mesos, Marathon
  • Infrastructure: Google Cloud Platform, Salt, Terraform
  • Monitoring: Datadog, Sentry, Jaeger
  • CI: Drone

For more information on how we use your data as an applicant, please click here -