Senior Data Engineer, Client Authentication

Southlake, TX; Austin, TX
Requisition ID: 2026-119604 · Category: Engineering & Software Development · Position type: Regular · Pay range: USD $150,000.00–$170,000.00/year · Application deadline: 2026-03-17
Your opportunity


At Schwab, you’re empowered to make an impact on your career. Here, innovative thought meets creative problem solving, helping us challenge the status quo and transform the finance industry together. We succeed as One Schwab—collaborating with trust, integrity, and a shared commitment to doing the right thing for our clients and each other.

We believe in the importance of in-office collaboration and fully intend for the selected candidate for this role to work on site in the specified location(s).

In this role, you’ll join Schwab’s Global Data and Analytics team to help build and evolve a large‑scale data intelligence platform on Google Cloud Platform (GCP). You’ll work at the intersection of data architecture and AI, engineering resilient pipelines that enable advanced analytics, fraud detection, and machine‑learning models at scale. This is a hands‑on, end‑to‑end engineering role where your work directly supports teams protecting clients and strengthening trust across the firm.

Key Responsibilities

  • Data Pipeline Architecture and Development: Design, build, and maintain scalable batch and streaming data pipelines using tools such as Dataflow (Apache Beam), Cloud Composer (Airflow), and Pub/Sub to ingest terabytes of transaction and behavioral data.
  • Advanced Coding: Write high‑performance, production‑grade Python and SQL, optimizing existing codebases for efficiency, latency, and cost.
  • Data Modeling: Implement complex data models in BigQuery, utilizing partitioning, clustering, and materialized views for optimal performance.
  • System Design: Architect robust backend data services and microservices to power analytics and AI platforms.
  • Infrastructure as Code: Write and maintain Terraform scripts to provision and manage GCP resources, ensuring reproducible and secure infrastructure.
  • Data Quality Engineering: Implement automated testing frameworks, data contracts, and anomaly detection systems into pipeline code.
  • Performance Tuning: Deep dive into query execution plans and pipeline bottlenecks to actively reduce latency and cloud costs.
  • Incident Resolution: Serve as the final point of escalation for critical data engineering issues, debugging complex failures in distributed systems.
  • Technical Leadership: Elevate team coding standards through rigorous code reviews and creation of solution architecture documents.
  • Mentorship: Mentor senior and junior engineers via pair programming and technical design sessions, helping them grow their skills.
  • Strategy: Collaborate with stakeholders to define the technical roadmap, selecting the right tools and patterns for long‑term success.
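The data quality responsibilities above (data contracts and anomaly detection built into pipeline code) can be sketched in Python. The contract fields and the z-score threshold below are illustrative assumptions for this posting, not Schwab's actual schemas or methods:

```python
from statistics import mean, stdev
from typing import Any

# Hypothetical data contract for an ingested transaction record;
# field names and types here are assumptions for illustration.
CONTRACT = {
    "account_id": str,
    "amount": float,
    "timestamp": str,
}

def contract_violations(record: dict[str, Any]) -> list[str]:
    """Return a list of contract violations for one record."""
    errors = []
    for field, expected in CONTRACT.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected):
            errors.append(f"{field}: expected {expected.__name__}")
    return errors

def is_volume_anomaly(history: list[int], today: int, z: float = 3.0) -> bool:
    """Flag today's row count when it sits more than `z` sample standard
    deviations from the historical mean (a naive anomaly baseline)."""
    mu, sigma = mean(history), stdev(history)
    return sigma > 0 and abs(today - mu) > z * sigma

# Usage: a complete record passes; a truncated one fails; a row-count
# spike far outside the historical range is flagged.
ok = contract_violations({"account_id": "a1", "amount": 9.5, "timestamp": "2026-01-01"})
bad = contract_violations({"account_id": "a1"})
spike = is_volume_anomaly([100, 102, 98, 101, 99], 500)
```

In a production pipeline, checks like these would typically run as a Dataflow transform or a dbt test rather than standalone functions.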

What you have


Required Qualifications

  • 8+ years of hands‑on software and data engineering experience with a proven track record of shipping complex systems to production.
  • 4+ years as a hands‑on senior engineer in startups and/or large organizations.
  • Bachelor’s degree in Computer Science or a related field.
  • Strong software engineering foundation, applying best practices (CI/CD, unit testing, modular design) to data pipelines.
  • Deep, practical experience with BigQuery, Dataflow, Pub/Sub, Cloud Storage, and IAM.
  • Expert‑level proficiency in Python and SQL, with the ability to write clean, maintainable, and efficient code.
  • Mastery of dimensional modeling, distributed systems, and modern data‑stack patterns.
  • Extensive experience with workflow orchestration using Apache Airflow or Cloud Composer.
  • Strong background in dbt (data build tool) implementation and strategy.
  • Proven track record with CI/CD, Terraform (infrastructure as code), and containerization (Docker and Kubernetes).

Preferred Qualifications

  • Deep expertise in real‑time data processing using Kafka or Pub/Sub.
  • Deep understanding of big‑data frameworks such as Apache Beam or Spark.
  • Experience with modern data stacks such as Snowflake or Databricks, though GCP is our primary platform.
  • Demonstrated business‑domain knowledge in fraud analytics.
  • Strong written and verbal communication skills to clearly convey ideas and feedback.
  • Google Professional Data Engineer certification.
  • Master’s or advanced degree in Computer Science or a related field.

In addition to the salary range, this role is also eligible for bonus or incentive opportunities.


What’s in it for you

At Schwab, you’re empowered to shape your future. We champion your growth through meaningful work, continuous learning, and a culture of trust and collaboration—so you can build the skills to make a lasting impact. Our Hybrid Work and Flexibility approach balances our ongoing commitment to workplace flexibility, serving our clients, and our strong belief in the value of being together in person on a regular basis.

We offer a competitive benefits package that takes care of the whole you – both today and in the future:

  • 401(k) with company match and employee stock purchase plan
  • Paid time for vacation, volunteering, and 28-day sabbatical after every 5 years of service for eligible positions
  • Paid parental leave and family building benefits
  • Tuition reimbursement
  • Health, dental, and vision insurance
