Tithe.ly is hiring a
Remote Data Engineer

📍 Remote

👤 Role Type: Full-time

💡 Desired Skills:

Data Engineer · Tableau · Snowflake · AWS · Google Cloud · CRM · Business Intelligence · SQL
Join Tithe.ly: We’re on a Mission

We’re a fully remote tech company that puts the Church first, serving over 50,000 churches around the world. We’re on a mission to see the local church have the financial stability and resources needed to build the Kingdom of God, and we work every day to deliver world-class, cost-effective technology tools to make the business of the church just a bit easier.

Job Description

The Data Engineer will play a pivotal role in designing, implementing, and maintaining our modern cloud data architecture. This position ensures data is reliable, timely, and optimized for consumption across the organization.

The ideal candidate is an ELT specialist with deep proficiency in dbt for transformation and Snowflake for scalable warehousing. This role requires a strategic, DataOps-focused mindset, strong collaboration skills, and a commitment to delivering measurable business impact.

Our data platform serves as the scalable backbone for product innovation, operational reporting, and strategic decision-making. We are seeking an accountable, results-oriented engineer to join us during a critical phase of platform development and investment.

Primary Responsibilities and Outcomes

The successful candidate will be responsible for driving meaningful, long-term impact across our data ecosystem. Within the first 6–12 months and beyond, they will focus on the following:

1. Architecting Trusted Data Assets (dbt, DataOps, Quality)

Outcome: Establish and continually advance our modern ELT environment (Snowflake/dbt) as the reliable backbone for corporate and team reporting, reducing the data trust deficit and metric ambiguity.

Deliverables:
  • Design, build, and maintain production-grade data models (Silver/Gold layers) using dbt, applying software engineering best practices such as CI/CD, testing, and modular design.
  • Proactively develop and maintain clear documentation and data lineage using dbt artifacts to ensure downstream users have full visibility into data definitions and transformation logic.
  • Partner closely with the data team on rigorous code reviews and uphold strong version-control standards (Git) across all transformation code.

2. Ensuring Scalability and Pipeline Reliability (Snowflake, Fivetran, Orchestration)

Outcome: Achieve and maintain 97% uptime and low latency for core ELT pipelines while proactively managing and optimizing cloud compute costs.

Deliverables:
  • Architect and manage production data pipelines in Snowflake, using advanced SQL to optimize performance, warehouse configurations, and cost efficiency.
  • Monitor and administer data ingestion through Fivetran to ensure seamless and timely loading of operational and application data.
  • Troubleshoot and resolve complex data issues, performing bug fixes, performance analysis, and root-cause evaluation across the data pipeline.

3. Delivering Business Value and Self-Service Enablement (Sigma/Tableau)

Outcome: Expand data accessibility and operational efficiency through self-service capabilities, reducing ad-hoc requests and empowering teams across the organization to achieve measurable efficiency gains.

Deliverables:
  • Curate, maintain, and deploy reliable, purpose-built data assets (data marts) optimized for use in BI tools such as Sigma and Tableau.
  • Identify opportunities to automate manual data processes to improve runtime, reduce errors, and enhance data quality.
  • Partner with stakeholders to ensure data assets are practical, well-structured, and aligned to key business goals.

Qualifications

A Successful Candidate Will:
  • Drive Results: Consistently achieve measurable benchmarks and optimize delivery speed while maintaining quality and reliability.
  • Possess a Strategic Mindset: Define mid- and long-term technical direction for data assets that aligns with overall business objectives.
  • Collaborate: Work effectively in a fully distributed, cross-functional team environment, communicating clearly with both technical and non-technical audiences.
  • Be Action Oriented: Apply a strong analytical mind and proactive problem-solving, adapting quickly to new technologies and operational changes.

Technical Qualifications

Required: 
  • Experience: 3–5 years of dedicated experience in a Data Engineering, Analytics Engineering, or Data Science role, preferably within a high-growth SaaS environment.
  • Programming & Databases: Advanced proficiency in SQL for data querying, manipulation, and optimization. Proficiency in Python for scripting, pipeline development, and data processing.
  • Core Stack Mastery:
    • dbt (Data Build Tool): Proven success designing and implementing complex, modular data transformations using dbt Core, including Jinja templating, testing, and documentation generation. Documented certifications are a plus.
    • Snowflake: Deep operational experience with Snowflake architecture, performance tuning, and optimizing data loading mechanisms (e.g., Snowpipe).
    • Fivetran: Experience managing, optimizing, monitoring, and configuring automated data ingestion pipelines using Fivetran or a similar automated EL tool.
    • BI/Consumption Layer: Familiarity with data consumption enablement using Sigma, Tableau, or similar BI tools, and the ability to design data models optimized for analytical performance.
    • Data Governance: Knowledge of data governance principles, including data privacy, security, and compliance best practices.

Preferred:
  • Familiarity with Stripe, HubSpot, and/or Zendesk.
  • Familiarity with cloud environments (AWS or GCP) for managing data lake ingestion points (e.g., AWS S3, GCP Cloud Storage).
  • Experience with Reverse ETL workflows (e.g., Hightouch, Census).
  • Experience with Git/dbt integration best practices.
  • Experience utilizing modern orchestrators such as Airflow, Dagster, or Google Cloud Composer for scheduling and dependency management.
  • Familiarity with Statsig, Amplitude, or other product feature-testing and experimentation tools.

Additional Information

The starting annual salary for this role is between $120,000 and $150,000.

Benefits

Health insurance, dental, and vision for your family; 401(k) with matching; remote work; paid time off; sick leave; parental leave; and more. We believe taking care of our team is important and want to be sure you have what you need.

Tithe.ly is an Equal Opportunity Employer

Yourgiving, Inc. DBA Tithe.ly (herein, “the Company”) is an Equal Opportunity Employer that does not discriminate on the basis of actual or perceived race (including hairstyles), color, alienage or national origin, ancestry, citizenship status, sex, gender, gender identity, pregnancy, childbirth or related medical condition, religious creed, physical disability or handicap, mental disability or handicap, age, medical condition (cancer), marital status, veteran status, sexual orientation, genetic information, arrest record, or any other characteristic protected by federal, state or local law. Our management team is dedicated to this policy with respect to recruitment, hiring, placement, promotion, transfer, training, compensation, benefits, employee activities and general treatment during employment.
