Portfolio Company Careers

Discover career opportunities across PFG's network of portfolio companies

Distinguished Data Architect II

Calix

IT
Multiple locations
Posted on Wednesday, April 24, 2024
Calix is leading a service provider transformation to deliver a differentiated subscriber experience around the Smart Home and Business, while monetizing their networks using role-based Cloud Services, Telemetry, Analytics, Automation, and the deployment of Software-Driven Adaptive Networks.

As part of a high-performing global team, the ideal candidate will bring deep expertise in data infrastructure, modeling, and architecture as a Distinguished Data Architect II, providing technical leadership and driving the next-generation architecture, design, and implementation of the Calix Cloud Data Platform. This role reports to our Vice President, Software Engineering.

Responsibilities and Duties:

  • Understand Calix solutions, platforms and data infrastructure to help develop the right strategy and design a comprehensive data platform architecture supporting operational and analytical use cases of Calix Cloud.
  • Lead the design of a robust data architecture that guides data modeling, integration, processing, and delivery standards enabling modern data product development.
  • Collaborate with data scientists, analysts, and cross-functional teams to design cohesive data models, database schemas, data storage solutions, and consumption strategies and patterns.
  • Develop standards for database design and implementation of various strategic data architecture initiatives around master data management, data quality, data management policies/standards, data governance, privacy, and metadata management.
  • Drive optimizations for performance and cost around storage, processing, and consumption needs.
  • Work closely with Cloud product owners to understand and analyze product requirements, provide feedback, and deliver a complete solution.
  • Guide technical discussions within the Engineering group and make technical recommendations.
  • Provide technical leadership of software design to meet the requirements of service stability, reliability, scalability, and security.
  • Guide architecture testing for large scale data ingestion and transformations.
  • Stay current on the latest developments and technologies through various forums, and bring them to the team for application in daily work.

Qualifications and Skills:

  • 15+ years of proven experience in modern data engineering, broad exposure to the wider data landscape, and solid software engineering experience.
  • 10+ years of development experience in data modeling, master data management, and building ETL/data pipeline implementations.
  • Proven proficiency in architecting, designing, and developing batch and real-time streaming infrastructure and workloads.
  • Experience in evaluating and proposing "right fit" technologies, and in strategizing and implementing data platform transformations/migrations.
  • Excellent problem-solving experience in large-scale, complex data warehousing environments.
  • Solid experience architecting and implementing metadata management, including data catalogues, data lineage, data quality, and data observability for big data workflows.
  • Strong understanding of data modeling, data architecture, and data governance, privacy and security principles and their practical implementations.
  • Expert-level working knowledge of Data Lake technologies, data storage formats (Parquet, ORC, Avro), query engines (BigQuery, Athena, Presto, Dremio), data schemas, query optimization, and associated concepts for building optimized solutions at scale.
  • Strong understanding of distributed systems, RESTful APIs, and data consumption patterns. Working experience with cloud-based data warehouses such as BigQuery, Redshift, Snowflake, etc.
  • Working experience integrating with BI frameworks such as Qlik, ThoughtSpot, Looker, Tableau, etc.
  • Hands-on experience implementing data pipelines for data ingestion and transformation to support BI analytics and ML pipelines.
  • Experience designing data streaming and event-based data solutions (Kafka, Kinesis, or similar) and building data pipelines (Flink, Spark, or similar).
  • Strong proficiency in the following programming languages or similar: Scala, Python, Go.
  • BS degree in Computer Science, Engineering, or Mathematics, or equivalent experience.

Location:

  • This is a remote-based position that can be located anywhere in the United States or Canada.

Compensation will vary based on geographical location (see below) within the United States. Individual pay is determined by the candidate's location of residence and multiple factors, including job-related skills, experience, and education.

For more information on our benefits, click here.

There are different ranges applied to specific locations. The average base pay range (or OTE range for sales) in the U.S. for the position is listed below.

San Francisco Bay Area Only:

208,200.00 - 432,400.00 USD Annual

All Other Locations:

181,000.00 - 376,000.00 USD Annual