Location: Bethesda, MD / Remote   |   Full-Time   |   $150,000 - $250,000
Black Canyon Consulting (BCC) is searching for a Senior DataOps Engineer (DevOps) to support our work for the National Center for Biotechnology Information (NCBI) at the National Library of Medicine (NLM), an institute of the National Institutes of Health. This full-time opportunity is onsite at NCBI in Bethesda, MD, remote, or a combination of both. VISA sponsorship is possible.

NCBI is part of the National Library of Medicine (NLM) at the National Institutes of Health (NIH). NCBI advances science and public health by providing free access to biomedical literature and genomic data over the web, making it one of the 400 most-visited sites in the world.

This is a great opportunity to work on challenging problems as part of a new DataOps Platform team at NCBI. Developing an enterprise-wide DataOps platform is a new initiative at NCBI. Building on decades of experience with some of the industry's most vital data-intensive applications, NCBI is tackling these problems at enterprise scale using modern technologies such as Kubernetes, GitOps, and containerization.

Duties & Responsibilities:
The DataOps Platform team:
* Develops and continuously improves the DataOps platform.
* Develops and maintains common tools and libraries.
* Evaluates new technologies and practices.
* Helps NCBI developers adopt the platform.
* Ensures compliance with Federal application security regulations and standards by providing automated solutions and compliance pipelines.
* Embraces agile development and continuous improvement.
* Encourages growth mindset and offers leadership opportunities at any level.

Required Skills:
* 7+ years of experience in the field.
* Strong coding skills in at least one programming language (e.g., Python, C++).
* Kubernetes, containerization.
* Google Cloud Platform (GCP), Amazon Web Services (AWS), Microsoft Azure or equivalent cloud services.
* Apache Kafka, Google Cloud Pub/Sub, or equivalent.
* Apache Airflow or equivalent.
* Experience with data processing applications and modern cloud-based data processing infrastructure.
* Linux command-line skills.

Bonus Skills:
* Google Anthos, Docker.
* GitOps tools: ArgoCD or equivalent.
* Infrastructure as code tools: Terraform or equivalent.
* GitLab, GitHub, Bitbucket, TeamCity, Artifactory, or equivalent products for SCM, CI/CD, and artifact management.
* Modern observability and logging tools: Prometheus, EFK (Elasticsearch, Fluentd, Kibana), TIGK (Telegraf, InfluxDB, Grafana, Kapacitor), DataDog, Sensu, Jaeger, Sentry, OpsGenie, PagerDuty, Splunk, or equivalent.
* Secrets management tools: HashiCorp Vault, CyberArk, Azure Key Vault, Google Cloud Secret Manager, or equivalent.
* Data transfer tools: AWS DataSync, Aspera, MinIO, CloudSoda or equivalent.
* Streaming and messaging systems: Apache Pulsar, RabbitMQ, Amazon Kinesis, Apache Flume, Apache Storm, Apache Spark Streaming, Google Cloud Pub/Sub.
* Experience with best-practice design patterns in coding and architecture.
* Experience working in an Agile environment.

Educational Requirements:
* B.S. in a STEM field (Engineering, Computer Science, Mathematics, Physics) or equivalent industry experience in Systems Engineering.

Benefits:
We attract the best people in the business with a competitive benefits package that includes medical, dental, and vision coverage, a 401(k) plan with employer contribution, paid holidays, vacation, and tuition reimbursement.
Post Date: May 14, 2025