MeghGen

MeghGen Technologies Private Limited is at the forefront of cloud-native data solutions, specializing in delivering innovative services across the GCP, AWS, and Azure platforms.

  • Forward-thinking: We embrace change and constantly seek new ways to push the boundaries of what’s possible.
  • Collaborative: We believe in open communication and teamwork, ensuring everyone has a chance to contribute and learn.
  • Flat hierarchy: No stuffy titles here. We value your expertise and encourage open dialogue across all levels.
  • Innovation driven: We motivate and empower our team members to explore new ideas and implement cutting-edge solutions.
  • Cloud-native specialists: We have deep expertise across AWS, Azure, and GCP, with a particular focus on Google Cloud Platform products and services.
  • Data specialists: We are data crafters who build and operate data pipelines end to end.

JOB OPENINGS

Google Cloud Data Engineer

We’re not just looking for employees; we’re looking for passionate data engineers who become integral parts of our driven team.

You should have experience building and optimizing data solutions on the Google Cloud Platform (GCP). In this pivotal role, you will take the lead in architecting, implementing, and managing our data infrastructure, ensuring high performance, reliability, and scalability. Your primary focus will be on developing comprehensive data pipelines, integrating disparate data sources, and constructing robust data warehousing systems, all within the GCP ecosystem.

Candidates with equivalent experience in AWS can also apply.

Minimum Experience: 2+ years

Key Responsibilities

  • Design and implement effective data solutions and models on GCP to support data analytics and business intelligence.
  • Develop and maintain scalable and reliable data pipelines to ingest, process, and distribute large volumes of data across the organization.
  • Optimize data architecture and data pipelines to improve performance and efficiency.
  • Collaborate with cross-functional teams to gather and translate business requirements into technical specifications and data models.
  • Ensure data integrity, quality, and compliance with data governance and security policies.

Technical Skills and Qualifications

  • Strong proficiency in GCP services such as BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
  • Demonstrated experience in data modeling, ETL development, and data warehousing principles.
  • Proficiency in SQL and experience with programming languages relevant to data engineering, such as Python or Java.
  • Familiarity with data integration tools and techniques, as well as experience with data visualization tools.
  • Knowledge of best practices in data engineering, including data integrity, data security, and disaster recovery.
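To give candidates a concrete flavour of the pipeline work this role involves, here is a minimal sketch of an ingest-time validation and deduplication step in Python. The record schema (`id`, `event_time`) and function name are purely illustrative, not part of any MeghGen codebase:

```python
from datetime import datetime, timezone

def clean_records(raw_records):
    """Validate, normalize, and deduplicate raw events before loading.

    Keeps only records with a non-empty 'id' and a parseable ISO-8601
    'event_time'; later duplicates of the same id are dropped, and
    timestamps are normalized to UTC.
    """
    seen = set()
    cleaned = []
    for rec in raw_records:
        rec_id = rec.get("id")
        ts = rec.get("event_time")
        if not rec_id or rec_id in seen:
            continue  # missing key or duplicate: reject
        try:
            parsed = datetime.fromisoformat(ts)
        except (TypeError, ValueError):
            continue  # unparseable timestamp: reject
        seen.add(rec_id)
        cleaned.append({
            "id": rec_id,
            "event_time": parsed.astimezone(timezone.utc).isoformat(),
        })
    return cleaned
```

In a real GCP pipeline a step like this would typically run inside a Dataflow transform before the load into BigQuery, with rejected rows routed to a dead-letter table for inspection.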

Ideal Candidate Profile

  • A self-starter with a track record of delivering data engineering projects with a high degree of ownership, accountability, and commitment.
  • Excellent problem-solving skills and the ability to work independently as well as collaboratively in a team environment.
  • Strong communication skills, with the ability to articulate complex technical concepts to non-technical stakeholders.
If you are passionate about leveraging data to drive business outcomes and thrive in a dynamic, fast-paced environment, we encourage you to apply. Join us in harnessing the power of data and technology to shape the future of our business.

Job Location

Ideally Bangalore, but we are open to outstanding remote candidates.

Employment Type

Full-time

Google Cloud Engineer

We’re seeking a highly skilled and experienced Google Cloud Engineer to join our dynamic team. As a key member of our organization, you will play a crucial role in designing, implementing, and managing our cloud infrastructure on the Google Cloud Platform (GCP).
In this role, you will leverage your expertise in cloud migration, infrastructure provisioning, DevOps, and related technologies to build and optimize our cloud environment. Your strong background in UI development, Java, Go, or Python will be an added advantage.

Candidates with equivalent experience in AWS can also apply. 

Minimum Experience: 2+ years

Key Responsibilities

  • Design and implement effective cloud infrastructure solutions on GCP to support application and data workloads.
  • Create scripts for, provision, and manage cloud migration tools for VMs, databases, networks, and other infrastructure components.
  • Design and implement scalable and resilient cloud architecture on GCP using Terraform.
  • Collaborate with development teams to containerize applications and deploy them on GKE clusters.
  • Implement and manage CI/CD pipelines using tools like Jenkins and GitLab.
  • Optimize cloud infrastructure for performance, cost-efficiency, and reliability.
  • Ensure the security and compliance of our cloud environment by implementing best practices and adhering to industry standards.
  • Monitor and troubleshoot cloud infrastructure using tools like Datadog, Splunk, and Google Cloud Monitoring (formerly Stackdriver).

Technical Skills and Qualifications

  • Extensive experience in cloud migration, with the ability to create scripts for, provision, and manage migration tools for VMs, databases, networks, etc.
  • Hands-on experience in building GCP infrastructure using Terraform.
  • Proficiency in DevOps tools such as GitHub, Jenkins, and GitOps.
  • Experience in containerization technologies like Docker and orchestration platforms like Kubernetes, with specific experience in GCP GKE.
  • Expertise in GCP compute services such as GCE, GKE, Cloud Run, and Storage.
  • Familiarity with alerting and monitoring tools like Datadog, Splunk, and Google Cloud Monitoring (formerly Stackdriver).
  • Strong experience in shell scripting.
  • Background in UI development, Java, Go, or Python is a significant advantage.
  • Experience in building and deploying CI/CD pipelines using Jenkins and GitLab.
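Reliability work of the kind listed above often comes down to small, well-tested utilities. As an illustrative sketch (the function and parameter names are hypothetical, not MeghGen tooling), here is an exponential-backoff retry helper of the sort used around infrastructure health checks:

```python
import time

def retry_with_backoff(check, max_attempts=5, base_delay=1.0, sleep=time.sleep):
    """Call `check()` until it returns True, backing off exponentially.

    Delays double on each failure (base_delay, 2*base_delay, ...).
    Returns the number of attempts used; raises RuntimeError if the
    check never succeeds. `sleep` is injectable so tests run instantly.
    """
    for attempt in range(1, max_attempts + 1):
        if check():
            return attempt
        if attempt < max_attempts:
            sleep(base_delay * (2 ** (attempt - 1)))
    raise RuntimeError("health check failed after %d attempts" % max_attempts)
```

The same pattern appears in CI/CD pipelines (waiting for a GKE rollout to become healthy) and in migration scripts polling for a VM or database to come online.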

Ideal Candidate Profile

  • A proactive problem-solver who can anticipate and mitigate potential issues in our cloud environment.
  • Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams.
  • Passionate about staying up-to-date with the latest trends and advancements in cloud technologies and DevOps practices.
If you thrive in a fast-paced, innovative environment and are excited about leveraging the power of GCP to drive our business forward, we encourage you to apply. Join us in shaping the future of our cloud infrastructure and delivering exceptional solutions to our customers.

Job Location

Ideally Bangalore, but we are open to outstanding remote candidates.

Employment Type

Full-time

Snowflake Developer

We are seeking a skilled Snowflake Developer to join our data engineering team. The ideal candidate will have 3-4 years of experience in designing, developing, and maintaining data solutions using Snowflake’s cloud-based data warehousing platform.

Key Responsibilities

  1. Design, develop, and maintain Snowflake data warehousing solutions to support business intelligence and analytics initiatives.
  2. Collaborate with cross-functional teams to understand data requirements and translate them into efficient Snowflake implementations.
  3. Optimize Snowflake performance by designing and implementing efficient data models, queries, and data pipelines.
  4. Develop and maintain ETL processes using Snowflake’s SQL and other ETL tools.
  5. Implement data security measures and ensure compliance with data governance policies.
  6. Monitor and troubleshoot Snowflake performance issues and implement necessary optimizations.
  7. Stay up-to-date with Snowflake’s latest features and best practices, and continuously improve the organization’s data warehousing solutions.

Requirements

  • Bachelor’s degree in Computer Science, Information Technology, or a related field
  • 3-4 years of experience in developing and maintaining Snowflake data warehousing solutions
  • Strong proficiency in SQL and experience with Snowflake’s SQL extensions
  • Experience with data modeling, data warehousing concepts, and best practices
  • Knowledge of ETL tools and processes, such as Informatica, Talend, or Snowflake’s native data integration capabilities
  • Familiarity with cloud computing platforms, such as AWS or Azure
  • Experience with data security and governance practices
  • Strong problem-solving and analytical skills
  • Excellent communication and collaboration skills
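A typical Snowflake development task is rendering incremental-load statements for ETL jobs. As a minimal sketch (table and column names are illustrative, and identifiers are assumed to be pre-validated rather than user input), a MERGE-based upsert might be generated like this:

```python
def build_merge_sql(target, staging, key_cols, update_cols):
    """Render a MERGE statement for an incremental upsert from a
    staging table into a target table (Snowflake-style SQL).

    Matched rows have their non-key columns updated; unmatched rows
    are inserted with both key and non-key columns.
    """
    on = " AND ".join(f"t.{c} = s.{c}" for c in key_cols)
    sets = ", ".join(f"t.{c} = s.{c}" for c in update_cols)
    cols = ", ".join(key_cols + update_cols)
    vals = ", ".join(f"s.{c}" for c in key_cols + update_cols)
    return (
        f"MERGE INTO {target} t USING {staging} s ON {on} "
        f"WHEN MATCHED THEN UPDATE SET {sets} "
        f"WHEN NOT MATCHED THEN INSERT ({cols}) VALUES ({vals})"
    )
```

In practice a statement like this would be executed via the Snowflake Python connector or an orchestration tool, with the staging table loaded from cloud storage beforehand.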

Preferred Qualifications

  • Certification in Snowflake (SnowPro Core Certification or SnowPro Advanced Certification)
  • Experience with data visualization tools, such as Tableau or Power BI
  • Knowledge of data lake architectures and experience with tools like AWS S3 or Azure Data Lake Storage
  • Familiarity with agile development methodologies

Job Location

Ideally Bangalore, but we are open to outstanding remote candidates.

Employment Type

Full-time

Reach out to us

Our experts will get back to you shortly

Send us your application...


    Copyright 2024 MeghGen. All rights reserved.
