(352) FASTTEK | (352) 327-8835
FASTTEK GLOBAL, powered by Fast Switch - Great Lakes
info@fasttek.com
Chennai, Tamil Nadu
Software Core Senior Engineer #1045010
Job Description:
  • Employees in this job function are responsible for designing, developing, testing and maintaining software applications and products to meet customer needs.
  • They are involved in the entire software development lifecycle including designing software architecture, writing code, testing for quality and deploying the software to meet customer requirements.
  • Full-stack software engineering roles, covering all components of the software including the user interface and server side, also fall within this job function.
 
Key Responsibilities:
  • Engage with customers to deeply understand their use-cases, pain points, and requirements, showcasing empathy and advocating for user-centric software solutions
  • Solve complex problems by designing, developing, and delivering solutions using various tools, languages, frameworks, methodologies (such as agile), and technologies
  • Assess the requirements of the software application or service and determine the most suitable technology stack, integration method, deployment strategy, etc.
  • Create high-level software architecture designs that outline the overall structure, components, and interfaces of the application
  • Collaborate with cross-functional partners such as product owners, designers, and architects
  • Define and implement software test strategy, guidelines, policies and processes in line with organization vision, industry regulations and market best practices.
  • Continuously improve performance, optimize the application, and adopt new technologies to maximize development efficiency
  • Apply programming practices such as test-driven development, continuous integration, and continuous delivery
  • Support security practices to safeguard user data including encryption and anonymization.
  • Create user-friendly and interactive interfaces
  • Develop and maintain back-end applications like APIs and microservices using server-side languages
  • Evaluate and incorporate emerging technologies and capabilities to deliver solutions; monitor and participate in solutions for new stack layers, often in collaboration with industry partners
 
Skills Required:
  • Data Architecture, GCP
 
Skills Preferred:
  • Cloud Architecture
 
Experience Required:
  • Senior Engineer experience: practitioner in two coding languages, or advanced practitioner in one language; guides others
  • 10+ years in IT; 8+ years in development
 
Experience Preferred:
  • Requires a bachelor's or foreign equivalent degree in computer science, information technology, or a technology-related field
  • 8 years of professional experience in:
    o Data engineering, data product development, and software product launches
    o At least three of the following languages: Java, Python, Spark, Scala, SQL, with performance-tuning experience
  • 4 years of cloud data/software engineering experience building scalable, reliable, and cost-effective production batch and streaming data pipelines using:
    o Data warehouses like Google BigQuery
    o Workflow orchestration tools like Airflow
    o Relational database management systems like MySQL, PostgreSQL, and SQL Server
    o Real-time data streaming platforms like Apache Kafka and GCP Pub/Sub
    o Microservices architecture to deliver large-scale real-time data processing applications
    o REST APIs for compute, storage, operations, and security
    o DevOps tools such as Tekton, GitHub Actions, Git, GitHub, Terraform, and Docker
    o Project management tools like Atlassian JIRA
  • Automotive experience is preferred
  • Support in an onshore/offshore model is preferred
  • Excellent problem-solving and problem-prevention skills
  • Knowledge and practical experience of agile delivery
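The workflow-orchestration experience above (tools like Airflow) comes down to running dependent tasks in dependency order. A minimal, standard-library-only Python sketch of that idea follows; the task names are hypothetical, and a real pipeline would be declared as an Airflow DAG rather than hand-rolled like this.

```python
from graphlib import TopologicalSorter  # stdlib topological sort (Python 3.9+)

# Hypothetical pipeline steps; each records its name so run order is visible.
run_log = []

def extract():   run_log.append("extract")
def transform(): run_log.append("transform")
def load():      run_log.append("load")

# Dependency graph: transform depends on extract, load depends on transform.
dag = {"transform": {"extract"}, "load": {"transform"}}
tasks = {"extract": extract, "transform": transform, "load": load}

# Execute tasks in an order that respects every dependency edge.
for name in TopologicalSorter(dag).static_order():
    tasks[name]()

print(run_log)  # ['extract', 'transform', 'load']
```

Schedulers like Airflow add retries, backfills, and scheduling on top of exactly this dependency-ordering core.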
 
Education Required:
  • Bachelor's Degree
 
Education Preferred:
  • Certification Program
 
Additional Information:
  • Design and implement data-centric solutions on Google Cloud Platform (GCP) using various GCP tools like BigQuery, Google Cloud Storage, Cloud SQL, Memorystore, Dataflow, Dataproc, Artifact Registry, Cloud Build, Cloud Run, Vertex AI, Pub/Sub, GCP APIs.
  • Build ETL pipelines to ingest the data from heterogeneous sources into our system
  • Develop data processing pipelines using programming languages like Java and Python to extract, transform, and load (ETL) data
  • Create and maintain data models, ensuring efficient storage, retrieval, and analysis of large datasets
  • Deploy and manage databases, both SQL and NoSQL, such as Bigtable, Firestore, or Cloud SQL, based on project requirements
  • Optimize data workflows for performance, reliability, and cost-effectiveness on the GCP infrastructure.
  • Implement version control and CI/CD practices for data engineering workflows to ensure reliable and efficient deployments.
  • Utilize GCP monitoring and logging tools to proactively identify and address performance bottlenecks and system failures
  • Troubleshoot and resolve issues related to data processing, storage, and retrieval.
  • Promptly address code quality issues using SonarQube, Checkmarx, Fossa, and Cycode throughout the development lifecycle
  • Implement security measures and data governance policies to ensure the integrity and confidentiality of data
  • Collaborate with stakeholders to gather and define data requirements, ensuring alignment with business objectives.
  • Develop and maintain documentation for data engineering processes, ensuring knowledge transfer and ease of system maintenance.
  • Participate in on-call rotations to address critical issues and ensure the reliability of data engineering systems.
  • Provide mentorship and guidance to junior team members, fostering a collaborative and knowledge-sharing environment.
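The ETL responsibilities above can be sketched in miniature. This is a hedged, self-contained Python example: the in-memory records and target "table" are stand-ins for Cloud Storage sources and a BigQuery table, which in practice would be accessed through the google-cloud client libraries.

```python
# Minimal extract-transform-load sketch with in-memory stand-ins.

def extract() -> list[dict]:
    # In practice: read from Cloud Storage, Pub/Sub, or a source database.
    return [
        {"vin": "1FT123", "price": "41000", "sold": "true"},
        {"vin": "1FM456", "price": "not-a-number", "sold": "false"},
    ]

def transform(rows: list[dict]) -> list[dict]:
    # Cast types and drop rows that fail validation.
    clean = []
    for row in rows:
        try:
            clean.append({
                "vin": row["vin"],
                "price": int(row["price"]),
                "sold": row["sold"] == "true",
            })
        except ValueError:
            continue  # production pipelines route bad rows to a dead-letter sink
    return clean

def load(rows: list[dict], table: list[dict]) -> None:
    # In practice: a BigQuery load job instead of an in-memory list.
    table.extend(rows)

table: list[dict] = []
load(transform(extract()), table)
print(table)  # [{'vin': '1FT123', 'price': 41000, 'sold': True}]
```

The second source row is rejected during transform because its price fails the integer cast, illustrating the validation step the role's data-quality duties describe.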