Sathya Resume Architect Bigdata
14+ years of experience as Consultant, Architect, and Tech Manager in full life cycle implementations
of Big Data and DW-BI projects using Hadoop, MapReduce, Pig, Hive, HBase, Sqoop, NoSQL, MongoDB,
Oracle Data Integrator, Informatica, IBM DataStage, OBIEE, OBI Apps, SAP BO, Netezza, Teradata, and
cloud-based data integration
Summary Profile
Architect
Good knowledge of Big Data integration and Big Data analytics using various Big Data tools
Experienced in data access and analytics through Pig, Hive, NoSQL, and MongoDB
Experienced in Hadoop, HBase, Sqoop, and MapReduce programming with Java
Good experience in implementation of large-scale DW/BI projects using various ETL, reporting,
and data quality tools, with strong customer-facing skills
Published white papers internally on various data integration and migration frameworks, and
involved in the development of various accelerators.
Extensive experience in solution proposals, architectural reviews, consulting services,
POCs, estimation, capacity planning, and sizing.
Deep technical experience with architecture, design, and hands-on programming on diverse
technology platforms including RDBMS, in-memory databases, Big Data, ETL engines, BI scorecards &
dashboards, and Service-Oriented Architectures (SOA).
Experienced in Enterprise metadata strategy building, metadata governance and Business
Glossary.
Experienced in end-to-end implementation of
OBI Apps 7.x (OBIEE, Informatica, DAC) and OBI Apps 11.x (OBIEE, ODI) for the Finance,
Procurement & Spend, HR, Supply Chain, Projects, Sales, Services, and Manufacturing
Analytics modules
Good knowledge of enterprise metadata (business, technical, operational), business
glossaries, and metadata reporting.
Experienced in dimensional modeling and industry-specific IBM FSDM data models
Experienced in managing large-scale projects: coordinating with cross-functional teams,
defining test strategies, tracking status, identifying and mitigating risks, and delivering
Experienced in project delivery models, test strategies, test plans, and tool assessments
Experience in handling large teams (40+ members) across different types of projects and
different technologies (development, enhancements, support)
Exposure to Agile, Waterfall, and iterative methodologies, and experienced in building
competencies
Cloud Integration
Exposure to cloud-based data integration using Dell Boomi
BIGDATA
Good programming skills using Core Java on Unix for MapReduce
Good experience in Big Data integration of unstructured, social media, and streaming data
with Hadoop/HBase/Sqoop, and ETL integration with Informatica, ODI, and DataStage
Experience in Informatica integration with social media networks and Hadoop
Exposure to streaming data analytics using IBM Streams
Exposure to in-memory databases and advanced analytics
Good experience in analytics using Pig, Hive, and NoSQL
Provided and implemented a Big Data solution for a major online retail client
Involved in Big Data implementations for major telecom and banking clients
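The MapReduce work cited above follows the standard map/shuffle/reduce pattern; a minimal illustrative sketch in plain Python (not Hadoop itself — the three phases are simulated in-process, and the word-count example is hypothetical):

```python
from collections import defaultdict

def map_phase(records):
    """Emit (word, 1) pairs, as a Hadoop Mapper would."""
    for line in records:
        for word in line.split():
            yield word.lower(), 1

def shuffle(pairs):
    """Group values by key, as the framework does between phases."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Sum the counts per key, as a Hadoop Reducer would."""
    return {key: sum(values) for key, values in grouped.items()}

records = ["big data moves fast", "fast data needs big pipelines"]
counts = reduce_phase(shuffle(map_phase(records)))
print(counts["big"], counts["data"], counts["fast"])  # 2 2 2
```

In Hadoop the same mapper and reducer logic would be written as Java classes, with the shuffle handled by the framework across cluster nodes.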
Experience with LDAP/external table authentication and integration with eBS/SSO
Experience in cache management, MUDE, usage tracking,
OBIEE deployments, catalog management, upgrades, and performance tuning
Good experience in BI Apps implementation using the Informatica and ODI technology stacks
Experience with DAC configuration and ODI Configuration Manager
Expert knowledge of BI Applications including basic and advanced configurations with Oracle
eBS suite.
Informatica
Experience as ETL Architect using Informatica Power Center and IDQ
Experience in ETL server configuration, sizing, and performance tuning
Hands-on experience with all components of Informatica, including complex mapping/workflow
development and performance tuning
Involved in QA support, defect resolution, and peer reviews
Experience in real-time data integration / change data capture
Experience in version upgrades and migrations
Others
Good knowledge of Linux and UNIX systems
Hands-on experience with SAP BO as a reporting tool
Good understanding of Teradata architecture and hands-on experience with the Teradata
BTEQ, FastExport, FastLoad, MultiLoad, and TPump utilities
Extensive experience in databases such as Oracle 9i, Teradata, Netezza, and MS Access, and
knowledge of MS SQL Server and Sybase
Hands-on experience in Cognos/Business Objects
Experience
Data modeling and customization of IBM FSDM model
Global Analytics: a product of HSBC, sponsored by the Group to define and implement
standard data management processes, a common information infrastructure, and an aligned
analytics and information management organization that delivers a regionally implemented
global model for Customer, Product, and Transactions.
Architected, designed, and involved in end-to-end product development and
implementation. Technologies used:
IBM InfoSphere DataStage, QualityStage, DB2, Teradata, Attunity
People Analytics: built a centralized database of core workforce metrics to evaluate high-level
trends using standardized logic across regions and produce standard, consistent
reporting to multiple customers.
Architected, designed, and involved in end-to-end product development and
implementation. Technologies used:
IBM InfoSphere DataStage, QualityStage, DB2, Teradata, Attunity
Thames Water: Thames Water Utilities Ltd is the largest clean water and wastewater
services company in the UK, serving nearly 12 million domestic and commercial customers
in London and the Thames Valley. Around the world, the International Products and Services
divisions provide a wide range of services to over 21 million customers.
Interface and Migration Project (3Com): 3Com is a communications manufacturing
and services company. It manufactures communication devices such as switches, hubs, and
routers, and provides services to its customers. The source system is SAP, and this
project populated a data warehouse in an Oracle database by extracting data
from the SAP interface.
DWH for GSK Italy: The GSK group is a leading provider of real-time surveillance healthcare
information and market research to global Primary Care Physicians (PCPs), medical
groups, and healthcare companies. It provides syndicated research intelligence data that tracks
and predicts illness and product sales for pharmaceutical and healthcare companies and marketing &
sales analysts. Data was available in the form of text files, Oracle, and MS Access, and was
extracted and loaded using Informatica. The data warehouse was populated on a daily basis
using ETL (Extraction, Transformation & Loading) jobs.
Design and development of a generic Data Warehouse solution including an
enterprise data warehouse and a Management Accounting data mart
Designing mappings using the application source qualifier (Informatica PowerConnect)
to extract data from the SAP system
Configuring the application source qualifier and reconfiguring the ABAP procedures
generated by SAP
Tuning the application source qualifier by using lookups within the qualifier itself
Rule-based enrichment, content rework, and infrastructure projects were other areas of
work involved in developing the generic solution
Designing BO universes to support ad hoc reporting
Designing standard reports in Business Objects
CDC : Attunity, PowerExchange
Scheduling Tools : Informatica Scheduler, Control-M, DAC
Data Modeling : Erwin
Cleansing Tools : Trillium, QualityStage