About Nucleon Security
Nucleon Security is a leading provider of cybersecurity solutions that help businesses protect their data and systems from online threats. We are a team of experts who are passionate about helping our clients stay safe and secure in an ever-evolving digital landscape.
Job Description
We are looking for a talented Data Engineer to design, build, and maintain our cloud-based data infrastructure. The ideal candidate has experience with cloud-based data solutions, including but not limited to Microsoft Azure, Databricks, Airflow, MongoDB, and lakehouse architectures. You will build and maintain data pipelines, data models, and ETL processes, and ensure data quality, reliability, and security.
Responsibilities
- Design and implement scalable and reliable data solutions using cloud-based technologies, including but not limited to Microsoft Azure, Databricks, and Airflow
- Build and maintain data pipelines using tools such as Apache Kafka, Apache Spark, and Apache Beam
- Develop and maintain data models and ETL processes using languages such as SQL, Python, and Scala
- Ensure data quality, reliability, and security by implementing data validation and testing frameworks, as well as monitoring and logging solutions
- Collaborate with our data scientists to ensure that our data is accurate, complete, and well-organized
- Optimize and tune our data solutions for performance and cost-effectiveness
- Maintain and support our data infrastructure, including MongoDB databases and lakehouse architectures
Requirements
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field
- Minimum of 3 years of experience in data engineering, with a focus on cloud-based data solutions
- Strong understanding of data modeling, ETL processes, and data warehousing
- Experience with at least one major cloud provider such as Amazon Web Services (AWS), Google Cloud Platform (GCP), or Microsoft Azure
- Experience with data processing frameworks such as Apache Spark and Apache Beam
- Proficiency in SQL and at least one programming language such as Python or Scala
- Experience with monitoring and logging solutions such as Grafana and Elasticsearch
- Strong analytical and problem-solving skills
- Excellent communication and teamwork skills
If you meet the requirements for this position and are interested in joining our team, please submit your resume and cover letter for consideration. We look forward to hearing from you!