Portfolio Jobs

Search open roles at our #staymagical portfolio companies

Staff Data Developer

OXIO

Software Engineering
United States
Posted on Tuesday, August 27, 2024
OXIO is the world’s first telecom-as-a-service (TaaS) platform. We are democratizing telecom, making it easy for brands and enterprises to fully own and operate proprietary mobile networks designed to support their own customers’ needs. Our TaaS solution combines multiple existing networks into a single platform that can be seamlessly managed in the cloud as a modern SaaS offering. And it gets better: with full network access comes unparalleled business intelligence and insight to help enterprises better understand customer and machine-to-machine (M2M) behavior. With a continuous focus on innovation, any company can build a powerful telecom presence with OXIO while gleaning unique customer insights like never before.
OXIO’s Data team is responsible for powering data-driven decision-making across the entire organization. In order for us to execute our mission effectively, we need to build a solid data foundation and ensure that every area of the business has access to highly reliable data.
We are hiring a talented and experienced Staff Data Engineer to join our small, but growing Data team, playing a critical role in designing and executing a robust and forward-looking data strategy for the company. Our team owns the data pipelines and tools that provide secure, reliable, and accessible data, enabling team members to derive actionable insights. Doing this job well means that we enable the entire organization’s ability to make more informed decisions, innovate faster, and serve our customers better.
In this role, you will work directly with our Data, Engineering, Operations, Data Science, Go-to-Market, and Finance teams to support the organization's data processing and analytics needs. You will serve as the internal expert on all things data engineering, empowering your peers with your expertise to collectively build a world-class data culture. This is a unique opportunity to directly influence not only our data systems, but also our drones and global operations. The ideal candidate will help us design systems that support the company’s needs today and many years into the future.

Key Responsibilities:

  • Help build, maintain, and scale our data pipelines that bring together data from various internal and external systems into our data warehouse.
  • Partner with internal stakeholders to understand analysis needs and consumption patterns.
  • Partner with upstream engineering teams to enhance data logging patterns and best practices.
  • Participate in architectural decisions and help us plan for the company’s data needs as we scale.
  • Adopt and evangelize data engineering best practices for data processing, modeling, and lake/warehouse development.
  • Advise engineers and other cross-functional partners on how to most efficiently use our data tools.
  • Develop data solutions through hands-on coding.
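The responsibilities above center on pipelines that move data from internal and external systems into a warehouse. As a purely illustrative sketch (not OXIO's actual stack; all names here are hypothetical), a minimal extract-transform-load step in Python might look like:

```python
import json
import sqlite3

def extract(raw_records):
    """Parse raw JSON event strings pulled from a source system."""
    return [json.loads(r) for r in raw_records]

def transform(events):
    """Keep only well-formed events and normalize field names."""
    return [
        {"user_id": e["userId"], "bytes_used": int(e["bytes"])}
        for e in events
        if "userId" in e and "bytes" in e
    ]

def load(rows, conn):
    """Append normalized rows into a warehouse-style table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS usage (user_id TEXT, bytes_used INTEGER)"
    )
    conn.executemany("INSERT INTO usage VALUES (:user_id, :bytes_used)", rows)
    conn.commit()

# A tiny end-to-end run against an in-memory database standing in for a warehouse.
raw = ['{"userId": "a1", "bytes": "512"}', '{"userId": "b2", "bytes": "2048"}']
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
```

In a production pipeline the same extract/transform/load stages would be orchestrated, monitored, and pointed at a real warehouse rather than SQLite.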

Key Qualifications:

  • 10+ years of experience as a data engineer and/or analytics engineer building large-scale data platforms and scalable data warehouses.
  • 5+ years of hands-on experience coding in Python or Spark for building and maintaining data pipelines.
  • 5+ years of experience working with AWS, dbt, and data warehouses such as Snowflake, Databricks, BigQuery, or Redshift.
  • Proficient with Dimensional Modeling (Star Schema, Kimball, Inmon) and Data Architecture concepts.
  • Advanced SQL skills (ease with window functions, defining UDFs)
  • Experience in implementing real-time and batch data pipelines with strict SLOs, and optimizing data storage and access patterns.
  • Proven track record of enhancing data reliability, discoverability, and observability.
  • Good understanding of storage layers such as Apache Hudi, Delta Lake, or Iceberg.
  • Aptitude for product analysis, dashboarding, and reporting
  • Demonstrated success in leading large-scale projects across teams and mentoring others in Data Engineering best practices.
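Among these qualifications, "advanced SQL skills (window functions, UDFs)" is a concrete, testable skill. As a hedged illustration (the schema and function names are invented for this example, not OXIO's), both can be exercised from Python's built-in sqlite3 module, which supports window functions and Python-defined UDFs:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE usage (user_id TEXT, bytes_used INTEGER)")
conn.executemany(
    "INSERT INTO usage VALUES (?, ?)",
    [("a1", 512), ("b2", 2048), ("c3", 1024)],
)

# A scalar UDF defined in Python and registered with the SQL engine.
conn.create_function("to_mb", 1, lambda b: round(b / 1024 / 1024, 4))

# A window function: rank users by usage without collapsing rows.
query = """
SELECT user_id,
       to_mb(bytes_used) AS mb_used,
       RANK() OVER (ORDER BY bytes_used DESC) AS usage_rank
FROM usage
ORDER BY usage_rank
"""
ranked = conn.execute(query).fetchall()
```

The same pattern (window functions for per-group rankings, UDFs for reusable transformations) carries over to warehouse engines like Snowflake or BigQuery.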

Nice To Haves:

  • Experience building streaming applications or pipelines using async messaging services or distributed streaming platforms like Apache Kafka
  • Knowledge of Airflow or another orchestration tool
  • Hands-on experience with event-driven architectures and streaming data processing frameworks such as Kafka, Spark, or Flink
  • Experience with time-series databases such as ClickHouse or InfluxDB
  • Familiarity with infrastructure tooling such as Terraform or Pulumi, and experience working with Kubernetes

What We Offer:

  • Competitive salary and stock option incentive program
  • Company contribution towards comprehensive benefit packages
  • Flexible work arrangements
  • Company sponsored team-lunches and company retreats
  • International organization that enables you to work across boundaries, travel to different locations and enjoy the dynamics of a rapidly growing startup
  • The opportunity to work with a talented and supportive team
  • A diverse and inclusive team