About Us:

At Parkar, we stand at the intersection of innovation and technology, revolutionizing software development with our cutting-edge Low Code Application Platform, Vector.ai. In almost a decade, we have expanded to four countries and now offer a full range of software development services, including product management, full-stack engineering, DevOps, test automation, and data analytics.

Vector.ai, our pioneering Low Code Application Platform, redefines software development by integrating over 500 modular code components. It covers UI/UX, front-end and back-end engineering, and analytics for a streamlined, efficient path to digital transformation through standardized software development and AIOps.

Our commitment to innovation has earned the trust of over 100 clients, from large enterprises to small and medium-sized businesses. We proudly serve key sectors like Fintech, Healthcare-Life Sciences, Retail-eCommerce, and Manufacturing, delivering tailored solutions for success and growth.

At Parkar, we don't just develop software; we build partnerships and pave the way for a future where technology empowers businesses to achieve their full potential.

For more information, visit our website: https://parkar.digital/

LinkedIn: https://www.linkedin.com/compa...

About the Role

We are seeking a skilled Data Engineer to design, build, and maintain data pipelines and analytics solutions that transform raw data into actionable insights.

Experience: 3-5 years

Responsibilities:

  • Collaborate with US-based customers to gather requirements, understand data integration needs, and provide technical recommendations.
  • Design and implement robust ETL pipelines to process data from multiple sources (see the illustrative sketch after this list).
  • Create and optimize SQL queries, stored procedures, and database objects.
  • Develop and maintain data integration workflows using Azure Data Factory.
  • Ensure data quality, accuracy, and consistency across all data pipelines.
  • Collaborate with analytics teams to support data-driven decision making.
  • Document technical processes and maintain data flow diagrams.
  • Present technical solutions and alternatives to stakeholders.
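To give candidates a feel for the day-to-day work described above, here is a minimal, illustrative ETL sketch in Python. It extracts rows from a flat-file export, applies a simple data-quality rule, and bulk-loads the result into a SQL Server staging table via pyodbc. The connection string, file name, and table names are hypothetical placeholders, not a description of our actual stack.

# Illustrative ETL step: load daily orders from a CSV export into SQL Server.
# Connection string, file path, and table names are hypothetical placeholders.
import csv
from datetime import datetime

import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=example-server.database.windows.net;"
    "DATABASE=AnalyticsDB;UID=etl_user;PWD=<secret>"
)

def extract(path):
    """Read raw rows from the source file."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    """Basic cleansing: skip incomplete records, parse dates, cast amounts."""
    for row in rows:
        if not row.get("order_id") or not row.get("amount"):
            continue  # data-quality rule: drop incomplete records
        yield (
            int(row["order_id"]),
            datetime.strptime(row["order_date"], "%Y-%m-%d"),
            float(row["amount"]),
        )

def load(records):
    """Bulk-insert cleansed records into a staging table."""
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()
        cursor.fast_executemany = True
        cursor.executemany(
            "INSERT INTO stg.Orders (OrderId, OrderDate, Amount) VALUES (?, ?, ?)",
            list(records),
        )
        conn.commit()

if __name__ == "__main__":
    load(transform(extract("daily_orders.csv")))

In practice, pipelines like this also include logging, retries, and reconciliation checks; the sketch only illustrates the shape of the work.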

Requirements:

  1. Advanced Microsoft SQL Server experience, including:
    o Complex T-SQL programming.
    o Creating and optimizing stored procedures.
    o Implementing views and triggers.
    o Setting up and managing scheduled jobs.
  2. Proven experience with Azure Data Factory (see the sketch after this list):
    o Building and maintaining data pipelines.
    o Implementing data transformation workflows.
    o Integration with various data sources.
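For reference, the sketch below shows how an Azure Data Factory pipeline run might be triggered and monitored programmatically with the Azure SDK for Python (azure-identity and azure-mgmt-datafactory). The subscription, resource group, factory, pipeline, and parameter names are hypothetical placeholders; this is not a prescribed implementation.

# Illustrative only: trigger and monitor an Azure Data Factory pipeline run.
# Subscription, resource group, factory, and pipeline names are placeholders;
# requires the azure-identity and azure-mgmt-datafactory packages.
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "rg-data-platform"
FACTORY_NAME = "adf-analytics"
PIPELINE_NAME = "pl_ingest_orders"

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Kick off the pipeline with a runtime parameter.
run = client.pipelines.create_run(
    RESOURCE_GROUP,
    FACTORY_NAME,
    PIPELINE_NAME,
    parameters={"loadDate": "2024-01-31"},
)

# Poll until the run reaches a terminal state.
while True:
    status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status
    if status in ("Succeeded", "Failed", "Cancelled"):
        print(f"Pipeline run {run.run_id} finished with status: {status}")
        break
    time.sleep(30)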

Good to Have:

  • Basic working knowledge of Snowflake.
  • Experience with data visualization tools.
  • Understanding of data warehousing concepts.

Qualifications:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • 4+ years of experience in data engineering or a similar role.
  • Strong problem-solving and analytical skills.
  • Excellent communication and documentation abilities.
  • Experience working in global teams and communicating across time zones.
  • Strong customer-facing communication skills with the ability to explain technical concepts clearly.