Designed high-level ETL/MDM/Data Lake architecture for overall data transfer from OLTP to OLAP using multiple ETL/MDM tools; prepared ETL mapping processes and maintained the mapping documents. Created the RPD and implemented different types of schemas in the physical layer as per requirement.

BI: OBIEE, OTBI, OBIA, BI Publisher, Tableau, Power BI, Smart View, SSRS, Hyperion (FRS), Cognos
Database: Oracle, SQL Server, DB2, Teradata, NoSQL, Hyperion Essbase
Operating Systems: Windows 2000, XP, NT, UNIX, MS DOS
Cloud: Microsoft Azure, SQL Azure, AWS, EC2, Redshift, S3, RDS, EMR
Scripting: JavaScript, VBScript, Python, Shell Scripting
Tools & Utilities: Microsoft Visual Studio, VSS, TFS, SVN, AccuRev, Eclipse, Toad
Modeling: Kimball, Inmon, Data Vault (Hub & Spoke), Hybrid
Environment: Snowflake, SQL Server, Azure Cloud, Azure Data Factory, Azure Blobs, DBT, SQL, OBIEE 12c, ODI 12c, Power BI, Windows 2007 Server, UNIX, Oracle (SQL/PLSQL)
Environment: Snowflake, AWS, ODI 12c, SQL Server, Oracle 12c (SQL/PLSQL)

Implemented the Change Data Capture (CDC) feature of ODI to refresh the data in the Enterprise Data Warehouse (EDW). Delivered and implemented the project per scheduled deadlines; extended post-implementation and maintenance support to the technical support team and client. 2+ years of experience with Snowflake. Constructed enhancements in Matillion, Snowflake, JSON scripts and Pantomath. Used import and export from the internal stage (Snowflake) and the external stage (AWS S3). Operationalized data ingestion, data transformation and data visualization for enterprise use. Developed jobs in both Talend Platform for MDM with Big Data and Talend Data Fabric. Strong knowledge of the SDLC. Experience with the Snowflake cloud-based data warehouse. Loaded data into Snowflake tables from the internal stage using SnowSQL. Designed and developed the business rules and workflow system in Talend MDM. Designed conceptual and logical data models and all associated documentation and definitions. Worked with Kimball's data modeling concepts including data marts, dimensional modeling, star and snowflake schemas, fact aggregation and dimension tables. Performed data quality issue analysis using SnowSQL by building analytical warehouses on Snowflake. Reviewed high-level design specifications, ETL coding and mapping standards. Designed and implemented a data compression strategy that reduced storage costs by 20%. Worked on performance tuning using EXPLAIN and COLLECT STATISTICS commands. Strong experience with Snowflake design and development. Extensive experience in creating BTEQ, FLOAD, MLOAD and FASTEXPORT scripts, with good knowledge of TPUMP and TPT. Worked on AWS Data Pipeline to configure data loads from S3 into Redshift; used JSON schema to define table and column mapping from S3 data to Redshift. Developed mappings, sessions, and workflows to extract, validate, and transform data according to the business rules using Informatica. Worked on SnowSQL and Snowpipe; converted Oracle jobs into JSON scripts to support Snowflake functionality.
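For illustration, a minimal sketch of the internal-stage load pattern described above, run from the SnowSQL CLI; the stage, file and table names here are assumptions, not taken from any actual project:

```sql
-- Hypothetical names: my_int_stage (named internal stage), customer_stg (target table).
-- Upload the local file to the internal stage, then bulk-load it with COPY INTO.
PUT file:///data/customers.csv @my_int_stage AUTO_COMPRESS = TRUE;

COPY INTO customer_stg
  FROM @my_int_stage/customers.csv.gz
  FILE_FORMAT = (TYPE = 'CSV' FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1)
  ON_ERROR = 'CONTINUE';
```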
Security configuration in WebLogic Server, at both the repository level and the web catalog level. Ensured accuracy of data and reports, reducing errors by 30%. Post-production validations such as verifying code and data loaded into tables after completion of the first cycle run.

JPMorgan Chase & Co. - Alhambra, CA
Designed and implemented efficient data pipelines (ETLs) to integrate data from a variety of sources into the data warehouse. Involved in design, analysis, implementation, testing, and support of ETL processes for Stage, ODS, and Mart. Prepared the test scenario and test case documents and executed the test cases in ClearQuest. Built Python and SQL scripts for data processing in Snowflake; automated Snowpipe to load data from the Azure cloud to Snowflake. Designed and developed a new ETL process to extract and load vendors from the legacy system to MDM using Talend jobs. Scheduled and administered database queries for off-hours processing by creating ODI load plans and maintaining schedules. Created roles and access-level privileges and handled Snowflake admin activity end to end. Involved in monitoring the workflows and optimizing the load times. Used TabJolt to run load tests against the views on Tableau. Very good experience in UNIX shell scripting. Extensive experience in migrating data from legacy platforms into the cloud with Lyftron, Talend, AWS and Snowflake. Experience in various data ingestion patterns into Hadoop.

Senior Data Engineer
Tested standard and ad-hoc SQL Server reports and compared the results against the database by writing SQL queries.

ETL Tools: Talend MDM 7.1/6.x/5.x, Informatica 7.x/8.x, SSIS, Lyftron
Big Data Technologies: Hadoop ecosystem, Spark, HDFS, MapReduce, Hive, Pig, Sqoop, NoSQL
Reporting Tools: Business Objects XI R2, Cognos 8.x/7.x, MicroStrategy and MS Access Reports
Operating System: Windows NT/XP, UNIX

Involved in end-to-end migration of 40+ objects totaling 1 TB from Oracle on-premises to Snowflake. Good understanding of the Azure Databricks platform and able to build data analytics solutions to support the required performance and scale. Experience in using Snowflake Clone and Time Travel. Extensive work experience in bulk loading using the COPY command. Conducted ad-hoc analysis, provided insights to stakeholders and created different dashboards. Excellent at adapting to the latest technology, with the analytical, logical and innovative knowledge to provide excellent software solutions.

Dashboard: Elasticsearch, Kibana

In-depth knowledge of SnowSQL queries and working with Teradata SQL, Oracle and PL/SQL. Created Oracle BI Answers requests, interactive dashboard pages and prompts. Well versed with Snowflake features like clustering, Time Travel, cloning, logical data warehouse and caching. Independently evaluate system impacts and produce technical requirement specifications from provided functional specifications.

IDEs: Eclipse, NetBeans

Extensively used SQL (inner joins, outer joins, subqueries) for data validations based on the business requirements. Good understanding of Teradata SQL, the EXPLAIN command, statistics, locks and the creation of views.
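As a sketch of the role-and-privilege setup mentioned above (the role, warehouse, database and user names are assumptions for illustration):

```sql
-- Create a read-only reporting role and grant it the minimum privileges it needs.
CREATE ROLE IF NOT EXISTS REPORTING_READER;

GRANT USAGE ON WAREHOUSE ANALYTICS_WH TO ROLE REPORTING_READER;
GRANT USAGE ON DATABASE EDW TO ROLE REPORTING_READER;
GRANT USAGE ON SCHEMA EDW.MART TO ROLE REPORTING_READER;
GRANT SELECT ON ALL TABLES IN SCHEMA EDW.MART TO ROLE REPORTING_READER;
GRANT SELECT ON FUTURE TABLES IN SCHEMA EDW.MART TO ROLE REPORTING_READER;

-- Assign the role to a user and make it their default.
GRANT ROLE REPORTING_READER TO USER jdoe;
ALTER USER jdoe SET DEFAULT_ROLE = REPORTING_READER;
```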
Over 8 years of IT experience in data warehousing and business intelligence, with an emphasis on project planning and management, business requirements analysis, application design, development, testing, implementation, and maintenance of client/server data warehouses. Ensuring the correctness and integrity of data via control files and other validation methods. Implemented business transformations, Type 1 and CDC logic using Matillion. Experience in various methodologies such as Waterfall and Agile. Strong accounting knowledge of cash flow, income statements, balance sheets and ratio analysis. Worked on Cloudera and Hortonworks distributions. Performed data quality analysis using SnowSQL by building analytical warehouses on Snowflake. Used various SSIS tasks such as Conditional Split, Multicast, Fuzzy Lookup and Slowly Changing Dimension for data scrubbing, including data validation checks during staging, before loading the data into the data warehouse from flat files, Excel and XML files. Involved in production moves. Produce and/or review the data mapping documents. Excellent experience in integrating dbt Cloud with Snowflake. Extensively worked on data extraction, transformation and loading from source to target systems using BTEQ, FASTLOAD and MULTILOAD; wrote ad-hoc queries and shared results with the business team.

Data Integration Tools: NiFi, SSIS

Designed new database tables to meet business information needs. Created external tables to load data from flat files, and PL/SQL scripts for monitoring. Used the FLATTEN table function to produce a lateral view of VARIANT, OBJECT and ARRAY columns. Testing code changes with all possible negative scenarios and documenting test results. Expertise in developing SQL and PL/SQL code through various procedures, functions, packages, cursors and triggers to implement the business logic in the database. Expertise in developing the Physical layer, BMM layer and Presentation layer in the RPD. Creating the repository and designing physical and logical star schemas. Used table CLONE, SWAP and the ROW_NUMBER analytical function to remove duplicate records. Hands-on experience with Snowflake utilities, SnowSQL, Snowpipe and big data modeling techniques using Python/Java. Experience in uploading data into an AWS S3 bucket using the Informatica Amazon S3 plugin. Customization of the out-of-the-box objects provided by Oracle. Created data sharing between two Snowflake accounts.
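As an illustrative sketch of the FLATTEN lateral-view pattern above (the raw_events table and its payload VARIANT column are assumed names):

```sql
-- Explode a nested JSON array stored in a VARIANT column into one row per element.
SELECT
    e.event_id,
    f.value:item_id::NUMBER AS item_id,
    f.value:qty::NUMBER     AS qty
FROM raw_events e,
     LATERAL FLATTEN(INPUT => e.payload:line_items) f;
```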
Data Warehousing: Snowflake, Teradata

Served as a liaison between third-party vendors, business owners, and the technical team. Designed suitable data models and developed metadata for analytical reporting. Applied various data transformations such as Lookup, Aggregate, Sort, Multicast, Conditional Split and Derived Column. Delta loads and full loads. Used Time Travel up to 56 days to recover missed data. Created internal and external stages and transformed data during load. Used Informatica Server Manager to create, schedule and monitor sessions and to send pre- and post-session emails to communicate success or failure of session execution. Understanding of Snowflake cloud technology. Highly skilled Snowflake developer with 5+ years of experience in designing and developing scalable data solutions. Experience building ETL pipelines in and out of data warehouses using a combination of Python and Snowflake's SnowSQL to extract, load and transform data, then writing SQL queries against Snowflake. Created new tables and an audit process to load the new input files from CRD. Productive, dedicated and capable of working independently.

ETL Tools: Informatica PowerCenter 10.4/10.9/8.6/7.13, MuleSoft, Informatica Power Exchange, Informatica Data Quality (IDQ)

Expertise in architecture, design and operation of large-scale data and analytics solutions on Snowflake Cloud.

DBMS: Oracle, SQL Server, MySQL, DB2

Ability to develop ETL pipelines in and out of the data warehouse using a combination of Python and Snowflake's SnowSQL, and to write SQL queries against Snowflake. Implemented Change Data Capture technology in Talend to load deltas to a data warehouse. Experience in performance tuning by implementing aggregate tables, materialized views, table partitions and indexes and by managing cache. Experience with Microsoft Azure cloud components such as Azure Data Factory (ADF), Azure Blobs, Azure Data Lake and Azure Databricks. Created various reusable and non-reusable tasks such as Sessions. Developed and maintained data models using ERD diagrams and implemented data warehousing solutions using Snowflake. Experience in analyzing data using HiveQL. Participate in design meetings for creation of the data model and provide guidance on best data architecture practices. Converted around 100 view queries from Oracle Server to Snowflake compatibility and created several secure views for downstream applications.

Cloud Technologies: Lyftron, AWS, Snowflake, Redshift

Professional Experience

Software Platform & Tools: Talend, MDM, AWS, Snowflake, Big Data, MS SQL Server 2016, SSIS, C#, Python

Sr. ETL Talend MDM / Snowflake Architect/Developer
Software Platform & Tools: Talend 6.x, MDM, AWS, Snowflake, Big Data, Jasper, JRXML, Sybase 15.7, Sybase IQ 15.5
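As a hedged sketch of the Time Travel recovery pattern above (the orders table name and offsets are illustrative; extended retention requires an edition that supports it):

```sql
-- Inspect the table as it looked 24 hours ago, before the missed/bad load.
SELECT COUNT(*) FROM orders AT (OFFSET => -60*60*24);

-- Recover that state into a new table via a zero-copy clone at the same point in time.
CREATE OR REPLACE TABLE orders_recovered CLONE orders AT (OFFSET => -60*60*24);

-- Keep 56 days of history available for future recovery (up to 90 days on Enterprise edition).
ALTER TABLE orders SET DATA_RETENTION_TIME_IN_DAYS = 56;
```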
Sr. Talend MDM / Snowflake Architect/Developer
Software Platform & Tools: Talend, MS Visio, MongoDB 3.2.1, ETL, Python, PyMongo, Python Bottle Framework, JavaScript
Software Platform & Tools: Sybase, UNIX shell scripting, ESP scheduler, Perl, SSIS, Microsoft SQL Server 2014
Software Platform & Tools: ETL, MFT, SQL Server 2012, MS Visio, Erwin
Software Platform & Tools: SQL Server 2007, SSRS, Perl, UNIX, ETL (Informatica), .NET (C#), Windows Services, Microsoft Visio
Cloud Platforms: Amazon AWS, Microsoft Azure, OpenStack

Experience in ETL pipelines in and out of data warehouses using Snowflake's SnowSQL to extract, load and transform data. Involved in various transformation and data cleansing activities using various control flow and data flow tasks in SSIS packages during data migration. Participates in the development, improvement and maintenance of Snowflake database applications. Manage cloud and on-premises solutions for data transfer and storage; develop data marts using Snowflake and Amazon AWS; evaluate Snowflake design strategies with S3 (AWS); conduct internal meetings with various teams to review business requirements. Code reviews to ensure the coding standards defined for Teradata. Created complex views for Power BI reports.

Cloud Technologies: Snowflake, SnowSQL, Snowpipe, AWS

Over 12 years of IT experience including analysis, design, development and maintenance, with 11 years of data warehousing experience using Informatica ETL (extraction, transformation and loading) PowerCenter/PowerMart and Power Exchange. Used different levels of aggregate dimension tables and aggregate fact tables.

Environment: OBIEE 11g, OBI Apps 7.9.6.3, Informatica 7, DAC 7.9.6.3, Oracle 11g (SQL/PLSQL), Windows 2008 Server

Created ETL design docs and unit, integration and system test cases. Created and scheduled iBots using Delivers to send alerts, run reports, and deliver reports to the users. Extracted data from an existing database into the desired format to be loaded into a MongoDB database. Responsible for development, support and maintenance of the ETL (Extract, Transform and Load) processes using Oracle and Informatica PowerCenter. Built a data validation framework, resulting in a 20% improvement in data quality. Created dimensional hierarchies for the Store, Calendar and Accounts tables. Reported errors in error tables to the client, rectified known errors and re-ran scripts. Developed ETL pipelines in and out of the data warehouse using Snowflake and SnowSQL, wrote SQL queries against Snowflake, loaded real-time streaming data into Snowflake using Snowpipe, implemented functions and procedures in Snowflake, and worked extensively on scale-out, scale-up and scale-down scenarios in Snowflake.
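An illustrative sketch of the scale-up/scale-out/scale-down scenarios mentioned above (the warehouse name, sizes and cluster counts are assumptions):

```sql
-- Scale up for a heavy batch window.
ALTER WAREHOUSE ETL_WH SET WAREHOUSE_SIZE = 'LARGE';

-- Scale out: let the multi-cluster warehouse add clusters under concurrency.
ALTER WAREHOUSE ETL_WH SET
  MIN_CLUSTER_COUNT = 1,
  MAX_CLUSTER_COUNT = 4,
  SCALING_POLICY = 'STANDARD';

-- Scale back down and suspend once the load completes.
ALTER WAREHOUSE ETL_WH SET WAREHOUSE_SIZE = 'XSMALL';
ALTER WAREHOUSE ETL_WH SUSPEND;
```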
Wrote ETL jobs to read from web APIs using REST and HTTP calls and loaded the data into HDFS using Java and Talend. Performed debugging and tuning of mappings and sessions. Involved in all phases of the SDLC, from requirement gathering, design, development and testing to production, user training and support for the production environment. Created new mapping designs using various tools in Informatica Designer such as Source Analyzer, Warehouse Designer, Mapplet Designer and Mapping Designer. Developed the mappings using the needed transformations in the Informatica tool according to technical specifications. Created complex mappings that involved implementation of business logic to load data into the staging area. Used Informatica reusability at various levels of development. Developed mappings/sessions using Informatica PowerCenter 8.6 for data loading. Performed data manipulations using various Informatica transformations such as Filter, Expression, Lookup (connected and unconnected), Aggregator, Update Strategy, Normalizer, Joiner, Router, Sorter, and Union. Developed workflows using the Task Developer and Worklet Designer in Workflow Manager and monitored the results using Workflow Monitor. Built reports according to user requirements. Extracted data from Oracle and SQL Server, then used Teradata for data warehousing. Implemented slowly changing dimension methodology for accessing the full history of accounts. Wrote shell scripts for running workflows in a UNIX environment. Optimized performance tuning at the source, target, mapping, and session level. Mapped incoming CRD trade and security files to database tables. Extensive experience in developing complex stored procedures and BTEQ queries. Created tables and views in Snowflake as per the business needs. Develop and sustain an innovative, resilient and developer-focused AWS ecosystem (platform and tooling). Experience with Power BI modeling and visualization. Expert in configuring, designing, developing, implementing and using Oracle pre-built RPDs (Financial, HR, Sales, Supply Chain and Order Management, Marketing Analytics, etc.). Worked on performance tuning/improvement and QC processes, supporting downstream applications with their production data load issues. Good exposure to cloud storage accounts such as AWS S3 buckets, creating separate folders for each environment in S3 and then placing data files there for external teams. Major challenges of the system were integrating many systems spread across South America and providing access to them, creating a process to involve third-party vendors and suppliers, and creating authorization for various department users with different roles.
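A hedged sketch of the slowly changing dimension (Type 2 history) pattern referenced above, written as Snowflake SQL; dim_account, stg_account and their columns are assumed names, not actual project objects:

```sql
-- Step 1: expire the current dimension rows whose attributes changed in the staging feed.
UPDATE dim_account
SET    effective_end_date = CURRENT_DATE,
       is_current = FALSE
FROM   stg_account s
WHERE  dim_account.account_id = s.account_id
  AND  dim_account.is_current = TRUE
  AND  (dim_account.account_name <> s.account_name
        OR dim_account.account_status <> s.account_status);

-- Step 2: insert a new current version for changed accounts and for brand-new accounts.
INSERT INTO dim_account
  (account_id, account_name, account_status, effective_start_date, effective_end_date, is_current)
SELECT s.account_id, s.account_name, s.account_status, CURRENT_DATE, NULL, TRUE
FROM   stg_account s
LEFT JOIN dim_account d
  ON   d.account_id = s.account_id
 AND   d.is_current = TRUE
WHERE  d.account_id IS NULL;
```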
Worked on MDM modeling through the MDM perspective in the Talend 5.5.1 suite and developed jobs to push data to MDM. Proficient in creating and managing dashboards, reports and Answers. Dataflow design for new feeds from upstream systems. Built ETL pipelines in and out of the data warehouse using a combination of Python and Snowflake's SnowSQL, and wrote SQL queries against Snowflake. Real-time experience loading data into the AWS cloud (S3 bucket) through Informatica. Created measures and implemented formulas in the BMM layer. Did error handling and performance tuning for long-running queries and utilities. Used Snowpipe for continuous data ingestion from the S3 bucket. Worked on a Snowflake shared technology environment providing stable infrastructure, a secured environment, reusable generic frameworks, robust design architecture, technology expertise, best practices and automated SCBD (Secured Database Connections, Code Review, Build Process, Deployment Process) utilities. Worked on snowflake schemas and data warehousing.

Data Engineer
Used sandbox parameters to check in and check out graphs from the repository. Created views and alias tables in the Physical layer. Good understanding of SAP ABAP. Experience in using Snowflake Clone and Time Travel. Strong experience in business analysis, data science and data analysis. Developed a real-time data processing system, reducing the time to process and analyze data by 50%. Led a team to migrate a complex data warehouse to Snowflake, reducing query times by 50%. Overall 12+ years of experience in ETL architecture, ETL development, data modeling and database architecture with Talend Big Data, Lyftron, Informatica, Apache Spark, AWS, NoSQL, MongoDB, Postgres, AWS Redshift and Snowflake. Created different types of reports such as pivot tables, titles, graphs and filters. Handled the ODI agent with load-balancing features. Extensive experience in creating complex views to get data from multiple tables. ETL development using Informatica PowerCenter Designer. Performance-tuned the ODI interfaces and optimized the knowledge modules to improve the functionality of the process. Experience in building Snowpipe, data sharing, databases, schemas and table structures. Created SQL/PLSQL procedures in an Oracle database. Analyzed the input data stream and mapped it to the desired output data stream. Involved in writing procedures and functions in PL/SQL. Expertise in configuration and integration of BI Publisher with BI Answers and BI Server.
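For illustration, a hedged sketch of the continuous S3 ingestion via Snowpipe mentioned above; the bucket, storage integration, stage, pipe and table names are assumptions:

```sql
-- External stage pointing at the S3 location that receives new trade files.
CREATE OR REPLACE STAGE ext_s3_stage
  URL = 's3://my-ingest-bucket/trades/'
  STORAGE_INTEGRATION = S3_INT
  FILE_FORMAT = (TYPE = 'JSON');

-- AUTO_INGEST = TRUE lets S3 event notifications trigger the load as files arrive.
CREATE OR REPLACE PIPE trade_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_trades (payload)
  FROM @ext_s3_stage
  FILE_FORMAT = (TYPE = 'JSON');

-- Check the pipe's notification channel and recent load state.
SELECT SYSTEM$PIPE_STATUS('trade_pipe');
```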
Environment: OBIEE 11g, ODI 11g, Windows 2007 Server, Agile, Oracle (SQL/PLSQL)
Environment: Oracle BI EE 11g, ODI 11g, Windows 2003, Oracle 11g (SQL/PLSQL)
Environment: Oracle BI EE 10g, Windows 2003, DB2
Environment: Oracle BI EE 10g, Informatica, Windows 2003, Oracle 10g

Worked in an industrial Agile software development process. Created reports in Looker based on Snowflake connections. Experience working with AWS, Azure and Google data services. Involved in creating test cases after carefully reviewing the functional and business specification documents. Root cause analysis for any issues and incidents in the application. Developed stored procedures and views in Snowflake and used them in Talend for loading dimensions and facts. Moved data from Netezza to the Snowflake internal stage and then into Snowflake with copy options. Created ETL mappings and different kinds of transformations such as Source Qualifier, Aggregator, Lookup, Filter, Sequence, Stored Procedure and Update Strategy. Created different types of reports, including union and merged reports, and prompts in Answers, and created the different dashboards. Experience with the Splunk reporting system. Experience working with various distributions of Hadoop such as Cloudera, Hortonworks and MapR. Very good knowledge of RDBMS topics and the ability to write complex SQL and PL/SQL; evaluate Snowflake design considerations for any change in the application; design and code the required database structures and components. Designed and implemented a data retention policy, resulting in a 20% reduction in storage costs. Read data from flat files and loaded it into the database using SQL*Loader. Good knowledge of UNIX shell scripting. Knowledge of creating various mappings, sessions and workflows. Experience with the Snowflake cloud data warehouse and AWS S3 buckets for integrating data from multiple source systems, including loading nested JSON-formatted data into Snowflake tables. Good knowledge of Snowflake multi-cluster architecture and components. Created reports and prompts in Answers and created dashboards and links for the reports.

Database: Oracle 9i/10g/11g, SQL Server 2008/2012, DB2, Teradata, Netezza, AWS Redshift, Snowflake
Big Data: Spark, Hive (LLAP, Beeline), HDFS, MapReduce, Pig, Sqoop, HBase, Oozie, Flume
Hadoop Distributions: Cloudera, Hortonworks

Extracted business logic, identified entities and identified measures/dimensions from the existing data using the Business Requirement Document. Created the data acquisition and interface system design documents.
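A hedged sketch of loading nested JSON from a stage into a VARIANT column, as referenced above; the stage, table and attribute names are assumptions:

```sql
-- Land the raw JSON documents in a single VARIANT column.
CREATE TABLE IF NOT EXISTS raw_orders_json (src VARIANT);

COPY INTO raw_orders_json
  FROM @json_stage/orders/
  FILE_FORMAT = (TYPE = 'JSON' STRIP_OUTER_ARRAY = TRUE)
  ON_ERROR = 'SKIP_FILE'
  PURGE = TRUE;   -- remove staged files after a successful load

-- Query nested attributes directly from the VARIANT column.
SELECT src:customer.id::NUMBER AS customer_id,
       src:order_total::FLOAT  AS order_total
FROM   raw_orders_json;
```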
For long-running scripts and queries, identified join strategies, issues and bottlenecks and implemented appropriate performance tuning methods. Participated in client business-need discussions and translated those needs into technical execution from a data standpoint. Handled performance issues by creating indexes and aggregate tables and by monitoring NQSQuery and tuning reports. Migrated code into production and validated the data loaded into tables after cycle completion. Created FORMATs, MAPs and stored procedures in an Informix database. Created/modified shell scripts to execute graphs and to load data into tables using IPLOADER. Tuned slow-performing queries by looking at the execution plan. Good knowledge of the Agile and Waterfall methodologies in the Software Development Life Cycle. Worked on Oracle databases, Redshift and Snowflake. Experience in data modeling, data warehousing, and ETL design and development using the Ralph Kimball model with star/snowflake model designs, with analysis, definition, database design, testing, and implementation processes. Participated in sprint calls and worked closely with the manager on gathering requirements. Developed workflows in SSIS to automate the tasks of loading the data into HDFS and processing it using Hive. Good working knowledge of ETL tools (Informatica and SSIS).
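As a hedged illustration of how such bottleneck hunting can be done on Snowflake (the threshold and limit values are arbitrary assumptions):

```sql
-- Surface the slowest recent queries (the Information Schema function covers roughly the last 7 days)
-- so their join strategies and profiles can be reviewed in the query profile.
SELECT query_id,
       user_name,
       warehouse_name,
       total_elapsed_time / 1000 AS elapsed_seconds,
       query_text
FROM   TABLE(INFORMATION_SCHEMA.QUERY_HISTORY(RESULT_LIMIT => 1000))
WHERE  execution_status = 'SUCCESS'
  AND  total_elapsed_time > 300000   -- longer than 5 minutes (milliseconds)
ORDER  BY total_elapsed_time DESC;
```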