About Me
Highly skilled Application Engineer (Redshift / AWS Cloud Engineer / Data Analyst / Data Warehouse Manager) with around 3 years of experience.
- Expertise in loading/unloading data from Amazon Redshift and in managing Redshift access privileges.
- Built S3 buckets and managed bucket policies; used S3 and Glacier for storage and backup on AWS.
- Experience with IAM (Identity and Access Management): created roles, users, and groups, wrote IAM policies as scripts, and implemented MFA (Multi-factor Authentication) to provide additional security to the AWS account and its resources.
- Created S3 buckets in the AWS environment to store files, some of which serve static content for a web application.
- Basic knowledge of designing and developing custom ETL components in Talend to migrate financial data from Oracle and other external sources, such as S3 and text files, into AWS Redshift, including data extraction, aggregation, and consolidation of financial data.
- Designed dashboards in Amazon QuickSight to get business insights from data; hands-on experience with the variety of QuickSight data sources, including files, AWS services, and on-premises databases.
- Created snapshots to back up volumes and images to store launch configurations of EC2 instances.
- Expertise in using Amazon Simple Storage Service (S3) for Amazon Redshift data loads; created external tables with partitions using Redshift.
- Set up alarms in CloudWatch to monitor server performance (CPU utilization, disk usage, etc.) and take recommended actions.
- Experience provisioning and spinning up AWS clusters; significant knowledge of best practices and market trends pertaining to the AWS cloud platform.
- Set up and managed databases on Amazon RDS; monitored servers through Amazon CloudWatch.
- Infrastructure development on AWS using services such as Redshift, S3, EC2, RDS, CloudWatch, and QuickSight.
- Worked with JIRA for defect/issue logging and tracking, and documented all work.
- Experience in database design, development, and support of MS SQL Server 2008/2005/2000 and Oracle for production and development; proficient in Relational Database Management Systems (RDBMS).
- Involved in design, development, and testing of systems; developed SQL Server stored procedures, user-defined functions, and views; tuned SQL queries using indexes and execution plans; rebuilt indexes and tables as part of performance tuning.
- Created triggers to maintain referential integrity; expertise in Transact-SQL (DDL, DML, DCL) and in design and normalization of database tables.
- Experience implementing business logic using triggers, indexes, views, and stored procedures; extensive knowledge of advanced query concepts (e.g., GROUP BY, HAVING, UNION, and so on); proficient in tuning T-SQL queries to improve database performance and availability.
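The IAM policy work described above can be illustrated with a minimal sketch. The bucket name, statement ID, and permissions below are hypothetical, not taken from a real account; a real policy would be attached to a user or group via the AWS console, CLI, or an SDK.

```python
import json

# A minimal, hypothetical IAM policy granting read-only access to one
# S3 bucket. The bucket name and Sid are illustrative placeholders.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowReadOnlyS3",
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::example-bucket",
                "arn:aws:s3:::example-bucket/*",
            ],
        }
    ],
}

# Serialize to the JSON document form that IAM actually consumes.
policy_json = json.dumps(policy, indent=2)
print(policy_json)
```

Writing the policy as a Python dict and serializing it keeps the JSON valid by construction, which is the main advantage of scripting policies rather than hand-editing them.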
Skills
- Amazon Redshift
- Amazon S3 / Glacier
- AWS EC2
- AWS IAM
- Amazon CloudWatch
- Amazon QuickSight
- Amazon RDS
- Talend
- MS SQL Server
- T-SQL
- Oracle
- JIRA
Portfolio Projects
Description
Vicharas vLens is a data management and analysis platform that facilitates the storage and analysis of very large data sets, transforming raw data into actionable insights. vLens hosts a wide range of data sets for mortgage securities, with the ability to add any new data set quickly, and performs complex analyses such as cross tabulation and time series analysis within seconds.
Description
Vicharas CMBS/CRE Solutions is the most advanced analytic framework available to commercial mortgage investors. With its grid-enabled computation architecture and comprehensive collateral analysis, the CMBS module of the V* CMBS/CRE Suite ensures faster and more accurate valuations, and provides portfolio managers, traders, and analysts with extended support for making more informed trading decisions and effectively managing the risk exposure related to CMBS investments.
Description
V* CLO is a suite of software modules aimed at the analysis of CLOs, CBOs, CDOs, TRuPs, and other corporate-credit-linked securitizations. The solutions leverage Intex deal waterfall models, run on Windows and Linux platforms, and provide unprecedented results in improving investors' decision support, risk management, surveillance, and accounting procedures.
Description
- Used IAM (Identity and Access Management) to create roles, users, and groups, and implemented MFA (Multi-factor Authentication) to provide additional security to the AWS account and its resources.
- Created S3 buckets in the AWS environment to store files, some of which serve static content for a web application.
- Created snapshots to back up volumes and images to store launch configurations of the EC2 instances.
- Used Workbench as the Redshift database tool.
- Loaded/unloaded data from Amazon Redshift.
- Managed Amazon Redshift access privileges.
- Created IAM (Identity and Access Management) policies as JSON scripts for users and groups.
- Used Amazon Simple Storage Service (S3) for Amazon Redshift data loads.
- Involved in large data migrations and transfers using bulk import/export utilities.
- Performed daily data loading (ETL) and verified data in the database.
- Worked extensively with AWS services, with a wide and in-depth understanding of each of them.
- Highly skilled in deployment, data security, and troubleshooting of applications using AWS services.
- Managed Amazon Redshift clusters, including launching and saving clusters.
- Implemented data analysis queries.
- Knowledge of Vertica and Amazon Redshift as database tools.
- Experienced and competent in C, Java, HTML, and MS-Excel.
- Maintained the historical data of the client.
- Tested the database by inserting test data.
Possesses strong working qualities with good interpersonal skills, high motivation, fast learning, and good teamwork, and is proactive in problem solving to provide the best solutions.
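The S3-to-Redshift load/unload work in the bullets above typically centers on Redshift's COPY and UNLOAD statements. A small sketch of generating those statements follows; the table name, bucket path, and IAM role ARN are placeholders, and real loads would submit the generated SQL through a database client.

```python
# Build Redshift COPY and UNLOAD statements of the kind used for
# S3 <-> Redshift data movement. All names below are placeholders.

def copy_stmt(table: str, s3_path: str, iam_role: str) -> str:
    """COPY loads files from an S3 prefix into a Redshift table."""
    return (
        f"COPY {table} FROM '{s3_path}' "
        f"IAM_ROLE '{iam_role}' "
        "FORMAT AS CSV IGNOREHEADER 1;"
    )

def unload_stmt(query: str, s3_path: str, iam_role: str) -> str:
    """UNLOAD writes a query's result set back out to S3."""
    return (
        f"UNLOAD ('{query}') TO '{s3_path}' "
        f"IAM_ROLE '{iam_role}' "
        "PARALLEL ON GZIP;"
    )

sql = copy_stmt(
    "sales",
    "s3://example-bucket/sales/",
    "arn:aws:iam::123456789012:role/redshift-load",  # hypothetical role
)
print(sql)
```

Generating the statements in code keeps table names, S3 prefixes, and the load role in one place, which is convenient when the same load runs daily against rotating file drops.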
Description
- More than 2 years of experience in IT, including database design, development, and support of MS SQL Server 2008/2005/2000 for production and development.
- Proficient in Relational Database Management Systems (RDBMS).
- Involved in Design, Development and testing of the system.
- Developed SQL Server stored procedures and tuned SQL queries (using indexes and execution plans).
- Developed user-defined functions and created views.
- Rebuilt indexes and tables as part of a performance-tuning exercise.
- Created Triggers to maintain the Referential Integrity.
- Reviewed existing business procedures and recommended and implemented changes.
- Expertise in Transact-SQL (DDL, DML, DCL) and in Design and Normalization of the database tables.
- Experience in implementing business logic using Triggers, Indexes, Views and Stored procedures.
- Extensive knowledge of advanced query concepts (e.g., GROUP BY, HAVING, UNION, and so on).
- Proficient in Tuning T-SQL queries to improve the database performance and availability.
- Hands-on experience in query tuning and performance tuning.
- Responsible for setting preferences for various ad-hoc requests and distribution of tasks.
- Troubleshot problems with various teams, including network-related issues.
- Created and automated regular jobs.
- Experience in Creating and Updating Clustered and Non-Clustered Indexes to keep up the SQL Server Performance.
- Good knowledge in maintaining Referential Integrity by using Triggers and Primary and Foreign Keys.
- Transformed data from one server to other servers using tools such as the bulk copy program (BCP) and Data Transformation Services, and performed data conversions from legacy systems to SQL Server.
- Excellent analytical, communication and interpersonal skills. Proficient in technical writing and presentations and a good team player.
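The trigger-based referential integrity mentioned in the bullets above can be sketched with a runnable example. SQLite (from the Python standard library) stands in for SQL Server here, so the trigger syntax differs from T-SQL, and the department/employee schema is invented for illustration.

```python
import sqlite3

# Enforce referential integrity with a trigger: reject inserts into
# emp that reference a dept row that does not exist. Schema is invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dept (dept_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE emp  (emp_id  INTEGER PRIMARY KEY, dept_id INTEGER);

-- Fires before each insert; aborts when the parent key is missing.
CREATE TRIGGER emp_dept_fk BEFORE INSERT ON emp
WHEN NOT EXISTS (SELECT 1 FROM dept WHERE dept_id = NEW.dept_id)
BEGIN
    SELECT RAISE(ABORT, 'dept_id does not exist');
END;
""")

conn.execute("INSERT INTO dept VALUES (1, 'Engineering')")
conn.execute("INSERT INTO emp VALUES (10, 1)")       # parent exists: accepted
try:
    conn.execute("INSERT INTO emp VALUES (11, 99)")  # no such dept: rejected
    rejected = False
except sqlite3.IntegrityError:
    rejected = True
```

In production SQL Server a declared FOREIGN KEY constraint usually covers this case; triggers earn their keep when the rule is more complex than a simple parent-key lookup (conditional cascades, audit rows, cross-table checks).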
Description
- Used MS-SQL Server 2005, 2008 as Database Tool.
- Created variables and opened and fetched values into variables from tables.
- Deleted and dropped database objects.
- Joined two or more tables and inserted data into tables using PL blocks.
- Creating an SQL Server Database manually and through wizard.
- Performed backup and restoration of smaller databases.
- Created new tables, wrote stored procedures, triggers, and some user-defined functions for application developers. Created SQL scripts for tuning and scheduling.
- Involved in performing data conversions from flat files into a normalized database structure.
- Developed source to target specifications for Data Transformation Services.
- Developed functions, views and triggers for automation
- Extensively used Joins and sub-Queries to simplify complex queries involving multiple tables and also optimized the procedures and triggers to be used in production.
- Provided disaster recovery procedures and policies for backup and recovery of databases.
- Performed performance tuning in SQL Server 2000 using SQL Profiler and data loading.
- Installed SQL Server client-side utilities and tools for all the front-end developers/programmers.
- Supported the database and other application support work.
- Supported loan modifications.
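The flat-file conversion work described above (loading delimited files into a normalized database) can be sketched with the standard library alone. The file layout, table, and figures below are invented for the example; in the SQL Server work itself this role would be played by BCP or DTS rather than Python.

```python
import csv
import io
import sqlite3

# A tiny, invented flat file standing in for a legacy export.
flat_file = io.StringIO("loan_id,amount\n101,2500.00\n102,1800.50\n")

# SQLite stands in for the target relational database here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loans (loan_id INTEGER PRIMARY KEY, amount REAL)")

# Parse the delimited file, cast each field to its column type,
# and bulk-insert the rows.
reader = csv.DictReader(flat_file)
rows = [(int(r["loan_id"]), float(r["amount"])) for r in reader]
conn.executemany("INSERT INTO loans VALUES (?, ?)", rows)
conn.commit()

# Verify the load: row count and total amount should match the file.
loaded = conn.execute(
    "SELECT COUNT(*), ROUND(SUM(amount), 2) FROM loans"
).fetchone()
print(loaded)
```

The verification query at the end mirrors the "verify data into database" step in the bullets: after any bulk load, row counts and control totals from the target are compared against the source file.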