Microsoft Modern Data Platform Application Designer


June 11, 2021

Job Description
About Accenture: Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Interactive, Technology and Operations services, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 514,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at
Accenture | Let there be change
We embrace change to create 360-degree value

  • Project Role: Application Designer
  • Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
  • Management Level: 10
  • Work Experience: 4-6 years
  • Work Location: Chennai
  • Must Have Skills: Microsoft Modern Data Platform
  • Good To Have Skills: Apache Spark, Python Programming Language
  • Job Requirements:

    • Key Responsibilities:
      1. Experience in developing Spark applications using Spark-SQL in Databricks for data extraction, transformation, and aggregation from multiple file formats, analyzing and transforming the data to uncover insights into customer usage patterns.
      2. Extract, Transform, and Load data from source systems to GCP data storage services using a combination of GCP Data Lake components; data ingestion to one or more GCP services (GCP BQ, GCP BQ ML, DataProc, etc.) and processing of the data in GCP Databricks.

    • Technical Experience:
      1. Overall, 8 years of industry experience, including 4 years of experience as a developer using Big Data technologies such as Databricks/Spark and the Hadoop ecosystem.
      2. Hands-on experience with Unified Data Analytics with Databricks: the Databricks Workspace user interface, managing Databricks notebooks, Delta Lake with Python, and Delta Lake with Spark SQL.
      3. Good understanding of Spark architecture with Databricks and Structured Streaming.
      4. Experience setting up a cloud platform with Databricks and the Databricks Workspace.

    • Professional Attributes:
      1. Good knowledge of distributed data processing.
      2. Good knowledge of BigQuery and other relational and non-relational databases.
      3. Good knowledge of data management principles.
      4. Hands-on experience designing and delivering solutions on a cloud platform, preferably GCP.
      5. Databricks administration.

    • Additional Information:
      1. Experience with SQL for data analysis.
      2. Hands-on experience with data extraction, schemas, corrup

15 years of full-time education