Skills:

Full Stack Developer

Amazon Elastic Compute Cloud
Amazon Web Services
Analytics Platform
Apache Airflow
API Gateway
AWS CloudFormation
AWS CloudWatch
Big Data
Change Management
Continuous Integration/Delivery
Database Modeling
Data Integrity
Data Modeling
Data Pipelines
Data Quality
Datasets
Data Visualization
Data Warehousing
Deployment
DevOps
Docker
ETL
Firebase
Git
GitLab
Golang
Google Cloud
Grafana
Hadoop
Identity and Access Management
Java
Jenkins
Kubernetes
NoSQL
Power BI
Prometheus
Python
Serverless Architecture
Shell Scripting
SQL
SSO
Tableau Software
Terraform
Workday
Description:

Full Stack Developer

  • Workday Job Title: Full Stack Developer
  • Duration: 02-Sep-2024 - 28-Mar-2025

Job Description: 

Highly experienced full stack engineer with expertise in cloud engineering, data engineering, and data warehousing, particularly within the AWS and Google Cloud ecosystems. Proven success in designing and implementing scalable cloud solutions and data pipelines, optimizing data architectures, and delivering high-quality analytics solutions. Proficient at collaborating with cross-functional teams to drive business insights and data-driven decision-making. Skilled in automating processes and ensuring high availability and security in cloud environments.

●      Implements scalable cloud architectures for enterprise applications using AWS services for data platforms.

●      Develops and maintains CI/CD pipelines to automate deployment processes and ensure continuous integration.

●      Leads teams in designing and developing centralized data warehouses and data lakes, preferably using Redshift.

●      Creates and optimizes ETL processes to ingest and transform large datasets for analytical purposes.

●      Utilizes Apache Airflow for scheduling and monitoring data workflows to ensure timely data delivery.

●      Implements data partitioning, compression strategies, and query optimization techniques.

●      Develops serverless applications using AWS Lambda and API Gateway to reduce operational costs.

●      Automates configuration management using tools like Ansible and Terraform.

●      Conducts security assessments and implements IAM and VPC best practices to secure cloud environments.

●      Integrates monitoring and logging solutions using CloudWatch, Prometheus, and the ELK Stack for performance and security monitoring.

●      Creates automated reporting solutions and interactive dashboards using Tableau, Looker, and Power BI.

●      Conducts data quality checks and ensures data integrity across multiple data pipelines.

●      Securely ingests data from public and client sources to enable better insights.

Technical Skills:       

●      Cloud Platforms: AWS (EC2, S3, RDS, Lambda, CloudFormation, VPC, IAM, Redshift)

●      Data Engineering: ETL, Data Warehousing, Data Modeling, SQL, NoSQL

●      Big Data Technologies: Hadoop, Spark, Athena, materialized views

●      DevOps Tools: Docker, Kubernetes, Jenkins, Terraform, Git, BigQuery, Firebase

●      Workflow Orchestration: Apache Airflow

●      Programming Languages: Python, SQL, Java, Go, Bash

●      Data Visualization: Tableau, Looker, Power BI, QuickSight

●      Monitoring & Logging: CloudWatch, Prometheus, Grafana, ELK Stack

●      Other Tools: GitLab CI/CD

●      Security: ForgeRock, Keycloak, SSO solutions for data platforms

  • Requirements:

Notes: Onsite role - Plano, TX

  • We are looking for a Full Stack Developer who has strong Java and Python experience (some Golang exposure) plus working exposure to data and analytics platforms. We are a data platform and analytics team.
  • Looking for someone who is familiar with technologies like Glue, PySpark, and Java; someone who understands the workings of the functions that make a performant data platform. Need someone with strong hands-on working experience.
  • You will be writing Python functions and Java functions, and sometimes supporting existing Golang functions, to help with the data platform's day-to-day activities.
  • Python, Java, Golang. A plus if you have worked on a data or analytics platform before and have data infrastructure and data warehouse experience.
  • 10 years of overall experience is required.
  • Somebody who is strong in Java and Python, with Golang a plus.
  • Data-centric candidate: data pipelines, Redshift/PySpark, and experience building their own data connectors.
  • AWS experience.
  • The project is approved through March 2025 and will then be extended further.

 

QUALIFICATION/LICENSURE:
  • Preferred years of experience : No preferred years of experience required
  • Travel Required : No travel required
  • Shift timings: 9 AM to 5 PM
Job Location Plano, Texas
Pay USD 71.00 Per Hour
Contract Duration 9 month(s)