Insight Global
Title: Data Engineer
Employment Type: Direct-Hire, FTE
Location: Remote (PST Hours)
Salary: $150,000 - $220,000
Required Skills & Experience
- 3–5 years of professional experience designing and operating production data pipelines at scale.
- Containerization & Orchestration: Expertise with Docker, Kubernetes, and Helm.
- Workflow Management: Hands-on experience building DAG-based pipelines in Apache Airflow.
- Programming: Strong proficiency in Python for data engineering tasks.
- Distributed Frameworks: Practical experience with Dask or Apache Spark for large-scale data processing.
- Cloud Fundamentals: Familiarity with deploying and managing services in a cloud environment.
- Compiled Languages: Experience writing data services in Go or Rust.
- GCP Proficiency: Hands-on with Google Cloud services (e.g., Pub/Sub, BigQuery, Cloud Storage, GKE). Equivalent experience with other public cloud providers is fine.
- ML Pipelines: Exposure to deploying cross-cluster model-training workflows using Ray or similar frameworks.
- Infrastructure as Code: Familiarity with Terraform for deployment.
- Security & Compliance: Knowledge of data governance, encryption, and role-based access control.
Nice to Have Skills & Experience
• Experience with the Go programming language.
• Familiarity with acceleration frameworks such as RAPIDS or Spark.
• Knowledge of cloud platforms (AWS, GCP, Azure).
• Experience with data version control and MLOps practices.
Job Description
We’re seeking a highly skilled Data Engineer to design, build, and maintain production-grade data pipelines that process and transform terabytes of data. In this role, you’ll collaborate closely with data scientists and software engineers to ensure that our data infrastructure is scalable, reliable, and cost-effective.
Title: Data Engineer
Employment Type: Direct-Hire, FTE
Location: Santa Clara, CA (4 days onsite)
Salary: $180,000 - $220,000
Required Skills & Experience
• 8+ years of experience in Data Engineering, DevOps, or related fields.
• Experience setting up and managing multitenant Apache Spark and Kafka clusters.
• Strong experience in Kubernetes.
• Strong programming skills in Python.
• Expertise in processing, cleaning, managing, and analyzing large-scale datasets.
• Familiarity with database technologies and the ability to choose between NoSQL, SQL, and cloud-based solutions based on use case.
• Ability to manage multiple projects simultaneously and collaborate effectively in a team environment.
• Strong problem-solving and communication skills.
Nice to Have Skills & Experience
• Proficiency with infrastructure automation tools such as Jenkins, Ansible, Terraform, or similar.
Job Description
Insight Global is looking for a Sr. Data Engineer (DevOps) to work 4 days on-site in Santa Clara, CA for a fast-growing technology company specializing in secure networking solutions, including SD-WAN, SASE, and SSE. The company helps global enterprises modernize their IT infrastructure for the cloud era. Led by industry veterans and backed by top-tier investors, it is focused on innovation, scalability, and preparing for a future IPO.
The Data DevOps Engineer will join the Machine Learning and AI team to build and maintain scalable data infrastructure supporting intelligent systems and analytics. The role involves close collaboration with the DevOps, Data Science, and Software Engineering teams. Responsibilities include:
• Design, build, and maintain robust data pipelines for ML/AI products.
• Integrate infrastructure automation and deployment workflows with DevOps teams.
• Implement new features and optimize data flow in collaboration with Data Scientists and Engineers.
• Enhance monitoring, logging, and debugging capabilities across data systems.
• Ensure scalability, reliability, and performance of data infrastructure.