Keywords
Application Programming Interfaces (APIs)
Mulesoft
Microsoft Azure
Cloud Computing
Databases
Continuous Integration
Data Science
Artificial Intelligence
Automation
Data Integration
Data Transformation
Human Resources
Innovation Management
Retail
Salesforce.com
Data Streaming
Technical Management
Data Processing
Kubernetes
Enterprise Integration
Technical Acumen
Docker
Attachments
Akash-Borgalli-Senior-MuleSoft-Developer_231124.pdf
Skills
Over the past four years, I have gained extensive experience in enterprise integration development, specialising in MuleSoft. I have worked on numerous projects that involved designing and implementing APIs, integrating systems, and ensuring seamless data flow across platforms. My work has focused on delivering scalable, secure, and efficient solutions tailored to the needs of clients.
At Tata Consultancy Services, I led the development of APIs on the MuleSoft Anypoint Platform, handling the end-to-end lifecycle from design to deployment. As part of a team of 14-16 members, I managed complex integration projects, including point-to-point and point-to-many integrations. One of my notable achievements was the Maximo-IsoMetrix Integration project, where I automated the exchange of job plan information across systems using six system APIs and two process APIs. This involved leveraging DataWeave for data transformation and deploying the APIs via Azure CI/CD pipelines, ensuring efficiency and reliability.
Before that, at CloudThat Technologies, I developed a real-time COVID-19 data integration system for the WHO, connecting multiple databases and enabling accurate global reporting. I also created APIs to integrate Salesforce, databases, and Blue Yonder for a retail-domain project, demonstrating my ability to handle diverse industry requirements. My responsibilities included developing Proofs of Concept (POCs) to demonstrate feasibility, implementing secure connections, and resolving more than 23 support tickets to enhance existing MuleSoft applications.
During my time at Capgemini, I gained experience in building APIs for workforce analytics systems, automating the flow of employee lifecycle data. I implemented a Pub-Sub Asynchronous Integration Pattern for smooth data processing and applied RAML design, MUnit testing, and secure connections to maintain high performance. Additionally, I led a team of junior developers to create a chatbot using Azure Cognitive Services, showcasing my ability to mentor and lead technical initiatives.
Apart from my integration expertise, I bring a strong foundation in cloud technologies. I am a certified Microsoft Azure Data Scientist and Azure AI Engineer, with hands-on experience deploying solutions on Azure. My skills extend to working with Docker and Kubernetes, enhancing my ability to handle modern containerised environments.
With a Master’s in Data Science from MTU, Ireland, I possess a solid academic background that complements my technical skills. My ability to translate complex requirements into actionable solutions, combined with a commitment to continuous learning, has helped me deliver impactful results. Whether it's designing APIs, managing CI/CD pipelines, or troubleshooting integration challenges, I bring a proactive and collaborative approach to every task.
I am eager to contribute my expertise in integration development, cloud computing, and data science to deliver innovative solutions that drive organisational success.
Project history
Automated the exchange of Job Plan information between Maximo and IsoMetrix, using 6 system APIs and 2 process APIs for seamless data flow and document management across Maximo, IsoMetrix, SharePoint, and SFTP.
Developed the IsoMetrix System API to retrieve job plan details, update SharePoint URLs, and manage statuses, following MuleSoft best practices, including RAML design and MUnit testing.
Used DataWeave for data transformation and deployed the APIs via Azure CI/CD pipelines, collaborating with Process API teams to ensure cohesive data transfer and workflow efficiency.
The business requirement was to centralise all COVID-related data, including confirmed cases, deaths, and vaccinations, in the UHO database, and to ensure that these details were accurately reported to the World Health Organisation (WHO). The goal was for the WHO to display this data on its global dashboard, providing real-time updates on COVID cases, deaths, and vaccinations worldwide.
This integration consisted of 1 Experience API, 1 Process API, and 3 System APIs, connecting the UHO Database, Amazon S3, and the WHO Data Warehouse to enable seamless data flow and reporting.
Responsible for developing integration flows for the UHub System API to handle COVID case types (Positive, Recovered, Death, Vaccinated), creating, updating, and managing records based on national_id and state, while ensuring data was sent to the Process API and reported onward to Amazon S3 and the WHO.
This project automated the publishing of employee life cycle events from SAP SuccessFactors to Global Workforce Analytics using MuleSoft, involving complex API integrations and data transformations.
The integration involved 1 experience API, 3 process APIs, and 14 system APIs, utilising a Pub-Sub Asynchronous Integration Pattern for efficient data flow and processing across systems.
Created the SuccessFactors system API to generate enriched employee data, ensuring accurate delivery through the process APIs and following best practices such as RAML design, MUnit testing, and secure connections.
Applied DataWeave for data transformation, deployed APIs using Azure CI/CD pipelines, and collaborated with cross-functional teams to ensure high performance and seamless integration.
Certifications
Salesforce Certified MuleSoft Developer - L1
2023