Job Information
Anthropic PBC Staff+ Software Engineer, Data Infrastructure in Seattle, Washington
Responsibilities:
Within Data Infra, you may be matched to one of several critical business areas, including:
- Data Governance & Access Control: Design and implement robust access control systems ensuring only authorized users can access sensitive data. Build infrastructure for permission management, audit logging, and compliance requirements. Work on IAM policies, ACLs, and security controls that scale across thousands of users and systems.
- Financial Data Infrastructure: Build and maintain data pipelines and warehouses powering business-critical reporting. Ensure data integrity, accuracy, and availability for complex financial systems, including third-party revenue ingestion pipelines, and manage external relationships as needed to drive upstream dependencies. Own the reliability of systems processing revenue, usage, and business metrics.
- Cloud Storage & Reliability: Architect disaster recovery, backup, and replication systems for petabyte-scale data. Ensure high availability and durability of data stored in cloud object storage (GCS, S3). Build systems that protect against data loss and enable rapid recovery.
- Data Platform & Tooling: Scale data processing infrastructure using technologies like BigQuery, BigTable, Airflow, dbt, and Spark. Optimize query performance, manage costs, and enable self-service analytics across the organization.
You might be a good fit if you:
- Have 10+ years (not including internships or co-ops) of experience in a Software Engineer role, building data infrastructure, storage systems, or related distributed systems
- Have 3+ years (not including internships or co-ops) of experience leading large scale, complex projects or teams as an engineer or tech lead
- Can set technical direction for a team, not just execute within it
- Have deep experience with at least one of the focus areas above: data governance and access control, financial data infrastructure, cloud storage and reliability, or data platform and tooling
- Have strong proficiency in programming languages like Python, Go, Java, or similar
- Have experience with infrastructure-as-code (Terraform, Pulumi) and cloud platforms (GCP, AWS)
- Can navigate complex technical tradeoffs between performance, cost, security, and maintainability
- Have excellent collaboration skills - you work well with both technical and non-technical stakeholders
Strong candidates may also have:
- Experience with security and compliance requirements (ITGC, GDPR, financial controls)
- Background in data warehousing, ETL/ELT pipelines, or analytics infrastructure
- Experience with Kubernetes, containerization, and cloud-native architectures
- Track record of improving data reliability, availability, or cost efficiency at scale
- Knowledge of column-oriented databases, OLAP systems, or big data processing frameworks
- Experience working in fintech, financial services, or highly regulated environments
- Security engineering background with focus on data protection and access controls
Technologies We Use:
- Data: BigQuery, BigTable, Airflow, Cloud Composer, dbt, Spark, Segment, Fivetran
- Storage: GCS, S3
- Infrastructure: Terraform, Kubernetes, GCP, AWS
- Languages: Python, Go, SQL
Deadline to apply: None. Applications will be reviewed on a rolling basis.
The annual compensation range for this role is listed below.
Annual Salary:
$405,000 - $485,000 USD
Logistics
Education requirements: We require at least a Bachelor's degree in a related field or equivalent experience.