Big Data Developer (Ops)

  • Vjal Portomaso, St Julian's, Malta
  • Full-time

Company Description

Tipico is the #1 sports betting company in Germany, both online and through our retail network.

The Data Tribe++ covers several areas and technologies aimed at driving Tipico’s data strategy across the whole group. These areas are covered by sub-teams, each managing its own domain: the modern data platform, business intelligence and advanced analytics, Data Governance & GDPR, Marketing Technology, Data Science / AI, and service management (customer service tooling & retail tooling). This is all aimed at supporting the business in its day-to-day operations while also facilitating data-driven decision-making and enhancing the customer experience to provide value through data. The Data Team's vision is “DATA AT THE CENTER OF EVERYTHING WE DO”, and this is something we live by. The team reports into Product and Technology under the Head of Data & BI and supports the organisation's internal customers and external suppliers. This spans Management, CRM, Acquisition, Finance, Retail, Bookmaking, Customer Service, Casino, Payments, Legal, Digital and Compliance, among others.

Job Description

Are you excited about Hadoop-based data technologies?

Are you passionate about technologies like Spark, Storm, Hive, Pig, Cassandra, Kafka and HBase?

Do you enjoy the challenge of structuring, organising and optimising large amounts of data?

Are you a pragmatic, self-motivated team player who understands the business context?

What you will do:

  • Continuously improve the performance of the application as well as the data structures
  • Linux configuration, deployment and troubleshooting
  • Hardware load balancers, network configuration, firewalls and troubleshooting
  • Manage the day-to-day running of real-time environments on AWS infrastructure
  • Stay on top of evolving technology, providing efficient design and improvements to the data architecture
  • Ensure data quality and consistency of the data applications produced
  • Provide hardware architecture guidance, plan and estimate cluster capacity, and create roadmaps
  • Help us drive our best-in-class modern data platform

Qualifications

What you offer:

  • Minimum of a B.S. in computer science or equivalent work experience
  • 2+ years’ experience using the Hadoop ecosystem to solve large-scale problems
  • Production experience with Jenkins, Docker, Git, Talend and big data technologies (Hadoop, HBase, Elasticsearch, Spark Streaming, Kafka; CDH is a plus); any of the above are an asset
  • Relational DB / SQL experience (MySQL, SQL Server, Oracle, etc.)
  • REST, microservices, multi-data center distributed architecture, support and planning
  • Knowledge of Java, Scala or Python
  • Incident management, reporting, communication and production feedback processes
  • Security, PK, JWT and related standard methodologies
  • Natural influencer and selfless teammate, willing to help your team achieve great things
  • Excellent problem-solving skills; proven technical leadership and interpersonal skills
  • Methodical, with the ability to simplify and refine
  • Continuous learner

 

Additional Information

What we offer

  • Agile and multicultural company with flat hierarchies
  • Self-organised, self-responsible and entrepreneurial employees 
  • Competitive salary, health and dental insurance, performance bonuses, subsidised parking, sports incentives & childcare
  • Opportunities to develop and grow
  • Relocation Assistance 
  • Office sea views, social events, healthy treats and a kitchen on every floor
