Data Engineer, TIDAL
- Oslo, Norway
- Current Square Employee?: Apply via go/jobs with your Square email.
TIDAL is an artist-centric global music and entertainment platform that brings artists and fans closer together through unique original content and exclusive events. Available in 61 countries, the streaming service has more than 70 million songs and 250,000 high-quality videos in its catalog along with original video series, podcasts, thousands of expertly curated playlists and artist discovery via TIDAL Rising.
With the commitment of its owners to create a more sustainable model for the music industry, TIDAL is available in premium and HiFi tiers—which includes Master Quality Authenticated (MQA) recordings as well as Sony’s 360 Reality Audio recordings.
In this role you will be part of the Data Analytics & Infrastructure team, an integral part of TIDAL and its growing number of analytics and data science experts. TIDAL’s internal data platform enables members of Business Analytics, Product Analytics, Marketing and other teams to surface actionable insights.
You will be responsible for expanding and optimizing the data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. You must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
As a Data Engineer, you will participate in designing, building, and maintaining TIDAL’s infrastructure for data ingestion, processing, analytics, and reporting by:
- Writing and maintaining Python libraries and applications
- Creating and automating batch and stream data processing workflows
- Architecting and managing tables and databases
- Analyzing and fixing errors in data pipelines
What we need:
- Competence in writing performant and stable code in Python 3
- Familiarity with data processing and warehousing techniques, and with common data formats
- Knowledge of SQL programming and development of data warehouses
- Experience sending/receiving data with web APIs
- Experience developing ETL pipelines with Spark
- 3+ years relevant work experience
It’s a PLUS if you have:
- Experience with Apache Airflow or other workflow management tools
- Java/Scala development experience
- Experience with AWS cloud services
- Knowledge of RESTful API development
Our technology stack:
- Data Lake on S3 + Parquet + AWS Glue + EMR
- Python 3
- Apache Spark
- Apache Airflow
- Amazon Athena
- Amazon Redshift
At Square, we value diversity and always treat all employees and job applicants based on merit, qualifications, competence, and talent. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status. We will consider for employment qualified applicants with criminal histories in a manner consistent with the requirements of the San Francisco Fair Chance Ordinance. Applicants in need of special assistance or accommodation during the interview process or in accessing our website may contact us by sending an email to assistance(at)squareup.com. We will treat your request as confidentially as possible. In your email, please include your name and preferred method of contact, and we will respond as soon as possible.
At Square, we want you to be well and thrive. Our global benefits package includes:
- Retirement Plan
- Employee Stock Purchase Program
- Life Insurance
- Wellness Allowance
- Employee Assistance Program
- Paid Parental Leave
- Paid Time Off
- Learning and Development Resources
We look forward to your application!