(352) FASTTEK | (352) 327-8835
FASTTEK GLOBAL, powered by Fast Switch - Great Lakes
info@fasttek.com
Chennai, Tamil Nadu

Data Engineering Engineer 2 #1055243
Job Description:
  • Employees in this job function design, build, and maintain data solutions, including data infrastructure and pipelines, for collecting, storing, processing, and analyzing large volumes of data efficiently and accurately.
 
Key Responsibilities:
  • Collaborate with business and technology stakeholders to understand current and future data requirements
  • Design, build, and maintain reliable, efficient, and scalable data infrastructure for data collection, storage, transformation, and analysis
  • Plan, design, build, and maintain scalable data solutions, including data pipelines, data models, and applications, for efficient and reliable data workflows
  • Design, implement, and maintain current and future data platforms, such as data warehouses, data lakes, and data lakehouses, for structured and unstructured data
  • Design and develop analytical tools, algorithms, and programs to support data engineering activities such as scripting and task automation
  • Ensure optimal performance and identify opportunities for improvement
 
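The pipeline responsibilities above can be illustrated with a minimal extract-transform-load sketch. This is plain Python with a hypothetical two-field schema (`id`, `amount`); an actual pipeline for this role would typically target GCP services such as BigQuery via Airflow rather than in-memory CSV:

```python
import csv
import io

def transform(row):
    """Normalize one raw record (hypothetical schema: id, amount)."""
    return {"id": int(row["id"]), "amount": round(float(row["amount"]), 2)}

def run_pipeline(source):
    """Extract rows from a CSV source, transform each, and load (collect) the results."""
    reader = csv.DictReader(source)
    loaded = []
    for raw in reader:          # extract
        clean = transform(raw)  # transform
        loaded.append(clean)    # load
    return loaded

raw_csv = io.StringIO("id,amount\n1,10.5\n2,3.333\n")
print(run_pipeline(raw_csv))
```

In a production workflow, each of the three stages would be a separate, retryable task (for example, an Airflow operator) so failures can be isolated and re-run independently.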
Skills Required:
  • Python, BigQuery, AI, Airflow, GCP
 
Experience Required:
  • Engineer 2: 4+ years of data engineering work experience
 
Education Required:
  • Bachelor's Degree
 
Education Preferred:
  • Master's Degree
 
Additional Information:
  • Bachelor's degree in Computer Science, Information Technology, Information Systems, Data Analytics, or a related field (or equivalent combination of education and experience).
  • 4+ years of experience in Data Engineering or Software Engineering, with at least 2 years of hands-on experience building and deploying cloud-based data platforms (GCP preferred).
  • Strong proficiency in SQL, Java, and Python, with practical experience in designing and deploying cloud-based data pipelines using GCP services like BigQuery, Dataflow, and DataProc.
  • Solid understanding of Service-Oriented Architecture (SOA) and microservices, and their application within a cloud data platform.
  • Experience with relational databases (e.g., PostgreSQL, MySQL), NoSQL databases, and columnar databases (e.g., BigQuery).
  • Knowledge of data governance frameworks, data encryption, and data masking techniques in cloud environments.
  • Familiarity with CI/CD pipelines, Infrastructure as Code (IaC) tools like Terraform and Tekton, and other automation frameworks.
  • Excellent analytical and problem-solving skills, with the ability to troubleshoot complex data platform and microservices issues.
  • Experience in monitoring and optimizing cost and compute resources for processes in GCP technologies (e.g., BigQuery, Dataflow, Cloud Run, DataProc).