Looking for a Data Warehouse Architect position in the Washington, DC metro area. I have expertise in architecting multi-terabyte data warehouse solutions.
Richard D. Hagedorn
SENIOR TERADATA ARCHITECT – ONSITE.
I AM TERADATA CERTIFIED & IBM CERTIFIED.
PLEASE DO NOT CONTACT ME UNLESS THIS IS C2C.
My rate for projects is $100/hr plus expenses, or $125/hr all-inclusive, C2C, net 30.
I am a US citizen.
I am immediately available and currently located on the east coast.
Contact me at e-mail: email@example.com or phone: 727-642-8184.
PLEASE USE EMAIL FOR CONTACT SO THAT WE CAN MAINTAIN DOCUMENTATION OF OUR AGREEMENTS.
I have extensive experience with Teradata's LDM for logical design development in the banking, DOD, manufacturing, and retail industries. I have extensively employed the Kimball design method for dimensional modeling, and have made extensive use of Teradata's MDM (Master Data Modeler) in conjunction with Teradata's LDM (Logical Data Modeler) in these industries. These concepts are important in the logical and physical design of the staging-area warehouse and EDW tables. My most recent engagements were at DOD (Army and Navy) and at Bank of America on the corporate customer warehouse, which involved creating and supporting the expense distribution work file tables and Business Objects ad-hoc workloads (along with other BI tools such as DataStage, Cognos, and MicroStrategy). One of my primary responsibilities was performance tuning of the warehouse, which included running explains on SQL code and running a macro I wrote to gather the important performance metrics and report on compliance, as well as working with the Index Wizard and using Teradata Performance Monitor to look at global and specific issues on the system. As the primary DBA, architect, and modeler both at the bank (PRDS) and on other projects, I was directly involved with the design of the warehouse. I was deeply involved with the design of the new Bank Data Warehouse (BDW) at BoA. This was our single point of truth at the bank; the design included business intelligence (BI) and encompassed activities such as Know The Customer (KTC) and anti-money laundering (AML) intelligence. On my many projects I defined and developed DBA standards, including database view design and extract, transform, and load (ETL) processes (from the operational data store (ODS), along with experience in DataStage, to the raw DB, then to staging through the ETL, and finally to target).
With both views and ETL I have been highly involved with performance; for example, an enterprise view with 32 joins across 12 tables, churning through 2.5 billion rows. I tuned this view from nearly 8 minutes of run time down to 40 seconds, primarily through collection of statistics (something that had been overlooked for a long time). I have extensive experience creating warehouses from scratch on new hubs, setting up users with Teradata Administrator, and establishing ROLE relationships for authorization and access to database (DB) objects such as tables, views, macros, functions, and procedures. I have worked closely with Teradata support to complete "floor sweeps" with upgrades to Teradata (TD) as our warehouse grew, thus utilizing the scalability of TD. Additionally, at Verizon, I worked directly with Teradata systems and operational staff to apply patches and test results across development, test, UAT, and production. Further, I worked with the Teradata help desk to resolve issues and potential problems identified through consistent, up-to-date systems monitoring and historical monitoring. I have 25 years of IT experience, 20 of those in database work: 15 years as a Teradata architect, modeler, and DBA, which includes 4 years in development and 3 years in production; 18 years in performance work, and within that, 12 years monitoring databases; 8 years in architecture and design work; 5 years in backups and disaster recovery (DR); over 15 years in standards work; 18 in managing DB access; over 15 years working with help desks; 20 years with development DBs; 7 years in security administration; and 5 years as primary point of contact for Teradata issues. I have managed teams of DBAs for over 7 years of that 25-year total.
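A minimal sketch of the statistics-collection step behind that tuning win (the database, table, and column names here are hypothetical, not from the actual bank warehouse):

```sql
-- Teradata: collect statistics on the join/filter columns so the
-- optimizer can estimate row counts and choose better join plans.
-- All object names below are illustrative.
COLLECT STATISTICS ON edw.account_txn COLUMN (account_id);
COLLECT STATISTICS ON edw.account_txn COLUMN (txn_date);
COLLECT STATISTICS ON edw.account_txn INDEX (account_id);

-- Re-run EXPLAIN on the slow view to confirm the plan changed:
EXPLAIN SELECT * FROM edw.v_enterprise_summary;
```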
I have extensive experience with Erwin, Teradata Administrator, Teradata Monitor, Teradata Manager, TASM, Teradata SQL Assistant 7.2, and Teradata Performance Monitor. My TASM experience covers its three products: Teradata Workload Analyzer, Teradata Dynamic Workload Manager, and Teradata Manager. I used Teradata Workload Analyzer to tap into the database query log and system tables. I used Dynamic Workload Manager to create rules that help maintain service-level goals; for a workload definition full of longer-running queries, the data warehouse manager can put in a rule that effectively tells the system to delay the next query if 20 similar ones are already executing. I used the third product, Teradata Manager, for real-time monitoring of the Teradata Database, giving me valuable performance information. Additionally, I have made extensive use of the ERwin modeling tool, both forward and reverse engineering logical and physical models (LDM/PDM), including web publication of the enterprise warehouse models with the associated metadata. I have extensive experience with all NCR tools. I completed Teradata certification classes with Teradata Corporation with high scores. I completed downloads and migration test work on the Teradata V6.2 upgrade to V12, creating warehouse test sandboxes and populating information into Erwin forward-engineered tables. I have Teradata experience and Teradata classes with V13.
My experience is extensive. I am a problem solver. I work hard to get the job done on time and on budget.
REMOTE TERADATA ARCHITECT
SUMMARY OF MY SKILLS:
Extensive TPump experience (TTU V12): Teradata Parallel Data Pump (TPump) is a data-loading utility that maintains the data in a Teradata warehouse using update, delete, insert, and atomic upsert operations. TPump maintains near-real-time data in a data warehouse when systems are too busy to devote a dedicated batch window to loading data.
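The atomic upsert that a TPump job applies per input row can be sketched with Teradata's UPDATE ... ELSE INSERT form (the table, columns, and :host variables are illustrative, not from a real job script):

```sql
-- If a row with this key exists, update it; otherwise insert it,
-- as one atomic operation -- the DML form TPump jobs use.
UPDATE edw.customer_balance
SET    balance = :balance,
       load_ts = CURRENT_TIMESTAMP
WHERE  cust_id = :cust_id
ELSE
INSERT INTO edw.customer_balance (cust_id, balance, load_ts)
VALUES (:cust_id, :balance, CURRENT_TIMESTAMP);
```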
TERADATA ARCHITECT & MODELER, with 25-plus years in relational databases and very strong skills in architecture, logical-to-physical design, creation of business objects (including databases, tables, and indexes for very large objects), and capacity planning. I have done extensive tuning and performance work. I have four-plus years with Business Objects. I check indexes, run explains to determine optimal database usage, check SQL code for optimal performance, and tune system parameters. Systems include: SunOS 5.6/Solaris 2.6 & 5.7/2.7, Ultra 1, Ultra 5, Ultra 60, E250, E450, E3500, E4000, E6000, E6500; network administration, systems installation and tuning; Linux/X Windows, Motif, BSD/AT&T UNIX, IBM AIX; OS programming/administration and systems programming; X-terminals, Interface, VMS web sites, IP routing; Transaction Server 1.2 & 1.3, dump analysis; DB2 Performance and Tuning Expert, DB2/2, DB2/6000, client server, IS technical manager, Teradata Administrator; VTAM/Network Systems; NAS and SAN storage systems; firewalls (Raptor and Netserve), LEGATO, and clustering with Veritas.
Applications include: enterprise resource planning (ERP) and customer relationship management (CRM) systems. Use of ETL (extract, transform & load) to analyze sales channels and understand buying behavior.
TERADATA V2R6.2; MAINFRAME DB2 1.3 to 7.0 & shared data concepts; UDB/DB2 & CONNECT; SQL SERVER; ORACLE 6, 7, 8, 8i, 9i & 10g with install experience and RAC; INFORMIX; SYBASE; DB2/400; UNIX DB2/6000; OS DB2/2; DB2/UDB; MVS/DB2; TERADATA; IMS; IDMS; M204; NATURAL; NOMAD; dBASE III; FOCUS; MS ACCESS
TERADATA UTILITIES: FastLoad, MultiLoad, TPump, QRYCONFIG, FRAMER, PACKDISK, QRYSESSN, RCVMANAGER, LOC, SHOWSPACE, TPCCONS, plus all support utilities such as FastExport, and Hummingbird's extract, transform, and load (ETL) tool. MVS TOOLS INCLUDE: TSO/ISPF, CICS, performance monitors, BMC/MAINVIEW & admin tools (all), Platinum (all), DB2PM, QMF, OMEGAMON, Teradata Manager, Queryman, WinDDI, PMON, Oracle Procedure Builder, TOAD, SQL & SQL*Plus, Erwin, OEM, Visio, MS Project, Brio, Visual InterDev 6.0, Exceed, MS Access, Excel, Word, PowerPoint, NetMeeting, FrontPage, Lotus Notes, NetWare, pcAnywhere, WinVNC, Checkpoint firewalls, Raptor and Netserve, LEGATO, and clustering with Veritas.
SEPT 2011 to JAN 2012
IBM CORPORATION, GLOBAL PROFESSIONAL SERVICES
SENIOR TERADATA ARCHITECT
I was the senior (and only) Teradata DBA on this project for IBM at a major (the biggest) sea transportation client. I provided support for day-to-day DBA activities, including DDL creation to support the warehouse and the BO semantic layer. I worked on numerous performance problems, including SQL code with high I/O and high spool-space utilization. I delivered recommendations regarding monitoring tools and evaluation of space requirements. I was responsible for security administration and setups for new users and new applications. I was available and on call for any and all problems, supporting their Teradata development, testing, integration, and production environments. The warehouse was established in 3rd normal form, with the semantic layer structured as a star (fact/dimension) architecture. I created numerous new tables and views at the semantic layer and supported Business Objects access to it.
DEC 2010 to AUG 2011
TERADATA CORPORATION
DEPT. OF DEFENSE, DEPARTMENT OF THE ARMY
HUNTSVILLE, AL. CBM PROJECT - BLACKHAWK
SENIOR TERADATA ARCHITECT, MODELING, PERFORMANCE
I was the senior database architect for this proof of principle (POP) Phase I presentation to the Department of the Army. I made extensive use of the logical data model to design and implement the physical data mart. General Pillsbury had written a white paper presenting the idea that the Army could eliminate the current schedule-based maintenance system and move to a Condition Based Maintenance (CBM) process. His plan was to gather all sensor data from the Blackhawk and, through BI tools such as DataMiner, use standard deviation to determine when to do maintenance on the helicopters. I modeled and designed the logical and physical objects to accommodate the data in a new physical data mart on a dedicated appliance, Teradata's 2555. I assisted with object creation and the ETL process to load the data into our BI warehouse, and was involved with performance tuning of the ETL loads. This analysis required an in-depth understanding of the logical data stores for the Blackhawk and Apache helicopters. We gathered information from the many sensors located on the aircraft and provided BI data-mining information to scientists and field commanders. This information was vital for safety and commander combat readiness. It included all flight details (including flight and mission logs), inventory details, and primarily engine and rotor status information. The Condition Based Maintenance data warehouse (CBM-DW) system fuses data from multiple sources and formats and integrates that data into a single database so that analysis across those sources can be performed efficiently and transparently. Those sources include Army Aviation data comprised of on-board Digital Source Collection (DSC) data, logbook data, mission-related data, and others. To the extent possible, the CBM-DW database was designed to accommodate CBM data from non-aviation sources.
I gave presentations and detailed briefings to the Army regarding the architecture and logical/physical models using Erwin as the modeling tool. Complex analysis included Oracle to Teradata transformations and data analysis. DataMiner was the tool used for the semantic layer to perform deep analysis of historical data in the first phase. In the second phase current data would be compared with the historical data to give detailed information to the engineers and commanders.
JUL 2010 to DEC 2010
DEPT. OF DEFENSE, NAVAIR LOGISTICS; NAVAL AIR SYSTEMS COMMAND
NAS PATUXENT RIVER
My work included architectural review (logical and physical), modeling (3rd normal form with surrogate keys), and ETL code fixes (including work with DataStage) for their data warehouse, called DeckPlate (DECision Knowledge Programming for Logistics Analysis and Technical Evaluation). My duties included writing ksh scripts and performance-tuning them. This analysis required an in-depth understanding of the logical data stores, such as AIIRS (Aircraft Inventory and Readiness Reporting System), CMIS, and others, which maintained the initial data input to the warehouse for global F18 status details. This included all flight details (including flight and mission logs), inventory details, and primarily engine status information. Maintenance changes were made through the maintenance warehouse, usually via ksh (Korn shell), then moved after validation into the QA environment and finally on to production. These changes were then rolled down to the testing and development platforms so that all platforms stayed in sync; for development changes this method was reversed to ensure consistency between all platforms. I fixed many problem areas and worked on new design areas for the Dept. of the Navy. I gave presentations and detailed briefings to the Navy regarding changes to Teradata DeckPlate. Complex analysis included Oracle-to-Teradata transformations and data analysis.
JUN 2010 to JUL 2010
TERADATA CORPORATION
NBC UNIVERSAL, LA, CALIFORNIA
I worked for Teradata Corporation at their client NBC Universal in Hollywood, California, as a senior Teradata architect and performance specialist. I worked directly with Mr. Rockoff, NBC's chief architect, in evaluating the logical/physical model, which was a basic snowflake design with fact and dimension tables. In addition, I did analysis on the "forklift" project from DB2 to Teradata release 12, and gave some input regarding the move to release 13 with its multilevel PPI support and other performance boosters. I provided performance analysis for the ETL processes, SQL queries, and batch processes, making recommendations to improve performance based on comparisons of execution times between DB2 and Teradata. My recommendations included tuning an ETL process that was running in 28 hours down to less than 2 hours; this was a complete redesign of the current ETL approach. I evaluated compression utilization and recommended a product tool: my recommendation was to purchase a tool from AtanaSoft, which is easy to use and works very well at recommending candidates for compression in the warehouse. This is truly a productivity tool, user (DBA) friendly and reliable. Cost: approximately $20K for the AtanaSuite. http://www.atanasoft.com/CompressTool/index.html
Further, I worked very closely with Ms. Higgins of Teradata Corporation, who was co-consultant with me to NBC on the MicroStrategy piece of the evaluation. Our recommendations at the systems level included parameter changes: there is a system parameter file on Teradata that may impact overall performance. The file CLISPB.DAT has a set of default parameters, three of which are very important to system performance.
Excellent ratings from NBC Universal and Teradata Corporation.
APR 2009 to MAY 2010
BANK OF AMERICA, CORP CENTER
CHARLOTTE, NC
Senior Teradata DBA on BofA's Bank Data Warehouse (BDW), with extensive modeling and architectural design on the AML project and extensive use of TPump (Teradata Parallel Data Pump) to keep the warehouse updated in real time. My work included ETL scripts and extensive work with DataStage, plus extensive work with Business Objects. Responsibilities included 4 large environments, each with its own warehouse in excess of 15 terabytes, with extensive logical-to-physical design. I maintained hundreds of database and DDL objects with extensive business modeling, and maintained stored procedures for ETL with extensive use of Informatica tools. I was the go-to DBA to approve code changes going from development to production. I did extensive performance tuning work: explains, index hashing, AMP distribution of primary hashed indexes, Teradata SQL Assistant, Teradata Performance Monitor, the Erwin modeling tool, and 3rd normal form. I was the lead Teradata consultant for the large data warehouse, from Base to Dev to Test and into Production. I have extensive experience with Erwin, Teradata Administrator, Teradata Manager, Teradata SQL Assistant 7.2, and Teradata Performance Monitor, including both forward and reverse engineering of logical and physical models (LDM/PDM) and web publication of the enterprise warehouse models with the associated metadata, as well as all NCR tools.
JUN 2008 to APR 2009
VERIZON COMMUNICATIONS
WASHINGTON, DC
SENIOR TERADATA ARCHITECT
Senior Teradata DBA on Verizon's business applications, with extensive modeling and architectural design work and extensive use of TPump (Teradata Parallel Data Pump) to keep the warehouse updated in real time. Extensive work with Business Objects. Responsibilities included 5 large environments, each with its own warehouse in excess of 9 terabytes, with extensive logical-to-physical design. I maintained hundreds of database and DDL objects with extensive business modeling, and maintained stored procedures for ETL with extensive use of Informatica tools. I was the go-to DBA to approve code changes going from development to production, and the lead Teradata consultant for the large data warehouse, from Base to Dev to Test and into Production. Extensive use of Erwin, Teradata Administrator, Teradata Manager, Teradata SQL Assistant 7.2, Teradata Performance Monitor, the versioning tool ClearCase, and the change control tool ClearQuest. Use of PuTTY to access Unix; SharePoint, Excel 2003, and PowerPoint project management tools, along with Outlook Express and SameTime. I have 15 years of experience in data warehouse environments and 30 years in IT overall, including 6 years of Teradata architecture, design, and implementation experience. I have Teradata DBA and development skills, senior data modeling skills, and extensive ETL skills, along with strong UNIX administration, data analysis, and data warehouse operational knowledge. Extensive use of TPump utilities to keep the warehouse updated in real time, using the recommended 3rd normal form for the architecture and modeling of the warehouse, with extensive use of the ERwin modeling tool and NCR tools. Some use of the dimensional model: the dimensional model is a logical design technique that seeks to present data in a standard, intuitive framework that allows for high-performance access. It is inherently dimensional, and it adheres to a discipline that uses the relational model with some important restrictions.
Every dimensional model is composed of one table with a multipart key, called the fact table, and a set of smaller tables called dimension tables. Each dimension table has a single-part primary key that corresponds exactly to one of the components of the multipart key in the fact table. This characteristic "star-like" structure is often called a star join.
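That fact/dimension structure can be sketched in Teradata-style DDL (all table and column names are illustrative):

```sql
-- One dimension table: single-part primary key.
CREATE TABLE store_dim (
    store_key  INTEGER NOT NULL,
    store_name VARCHAR(60),
    region     VARCHAR(30)
) UNIQUE PRIMARY INDEX (store_key);

-- The fact table: a multipart key, each part corresponding
-- to exactly one dimension's single-part key (the "star join").
CREATE TABLE sales_fact (
    date_key     INTEGER NOT NULL,   -- joins to date_dim
    store_key    INTEGER NOT NULL,   -- joins to store_dim
    product_key  INTEGER NOT NULL,   -- joins to product_dim
    sales_amount DECIMAL(12,2),
    units_sold   INTEGER
) PRIMARY INDEX (date_key, store_key, product_key);
```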
SEP 2006 to JUN 2008
EDS INTERNATIONAL
EAST COAST & GLOBAL OPERATIONS
DB2/TERADATA ARCHITECT & MODELER
I was DBA & modeler lead/manager for EDS global systems, primarily DB2, but also did modeling and architectural analysis on Teradata projects. I was the lead database consultant for many of EDS's large data warehouses in production, with extensive work in Business Objects. Expert use of Erwin as a modeling tool, with extensive use of Teradata Administrator, Teradata Manager, Teradata SQL Assistant 7.2, Teradata Performance Monitor, the versioning tool ClearCase, and the change control tool ClearQuest. Use of PuTTY to access UNIX; SharePoint, Excel 2003, and PowerPoint project management tools, along with Outlook Express and SameTime. Creation of business objects including databases, tables, and indexes. I was primarily responsible for data encryption and the PCI security standards at several installations, allowing for the ongoing development, enhancement, storage, dissemination, and implementation of security standards for account data protection. I actively designed logical and physical databases and associated objects for EDS at its diverse client sites, including the USA, New Zealand, and Chennai, India. I led a team of 8 DBAs and maintained global 24/7 support for client production and testing systems. This involved design and creation of data warehouse databases for DB2. I extracted and loaded data using Teradata ETL tools, maintained stored procedures for ETL with extensive use of Informatica tools, and did design and modeling with Erwin. I maintained backup and recovery procedures and coordinated 24-hour response to both production and testing problems.
NOV 2005 to AUG 2006
UNIVERSITY OF CALIFORNIA BERKELEY
SF BAY AREA
DB2/TERADATA ARCHITECT & MODELER
Alchemist Change Control Manager
Modeling and architecture work on database systems. I was the single point of contact for all program migration into production, with creation of business objects including very large databases, tables, and indexes. I used the change control tool Alchemist to administer these changes. I was also responsible for Teradata maintenance and performance tuning, plus extensive work with Business Objects. This included using my performance tuning scripts and making changes to the Teradata systems to improve performance and throughput. I have letters of recommendation from the associate director for work well done on the installation of Alchemist and the take-on of the new processes implemented within the STU (student) environment. I assisted with training and worked with Alchemist's senior-level install and change team. I reported directly to UCB top management regarding the critical installation of the new release, Alchemist 5.3.1, and assisted with education of staff on new Teradata features.
APR 2005 to OCT 2005
IBM ACCENTURE
SAN FRANCISCO, CA
DB2 ARCHITECT, MODELING AND ADMINISTRATION
Installation of DB2 midrange on over 25 environments, with extensive use of ETL processes to keep the warehouse updated in real time. Each environment supported as many as 4 database warehouses. Creation of business objects including very large databases, tables, and indexes. Primary responsibility for daily migration of DDL through the system via scripts and some automated processes. Worked extensively with the application coders to enhance SQL queries and improve performance. Responsibilities also included userid administration for DB2. I wrote many scripts to automate the administration of so many databases: runstats, referential integrity checks, tablespace checks, etc. Worked with crontab to install an automated backup process which ran every night on the databases. Applied configuration parameter changes for performance boosts. I worked extensively with the production architects and modelers to assist with design and performance issues. I worked on problem tickets and updated and maintained documentation regarding the DB2 configuration and architecture.
AUG 2003 to APR 2005
MERCURY INTERACTIVE
SUNNYVALE, CA
DB2/UDB, TERADATA DEVELOPMENT, PERFORMANCE TUNING
Work included performance tuning on AS/400 systems and Teradata systems, with onsite work at various client sites to assist with performance tuning relative to capacity management and growth. I am fluent in the administration and database support required for AS/400 systems. Additional work included performance tuning for many Fortune 500 companies, including Duke Energy. Work at Duke included resolution of many problems related to the DB2 subsystems, both at the systems level and at the applications level. My methodical approach to tuning ensures success on all applications; large databases and robust applications are my forte. Duke was a big user of PeopleSoft products, BMC, and other DB2 tools. I fixed problems relative to slow close (DSMAX parameter too small), log waits (OUTBUFF too small), and many problems related to sequential prefetch and list prefetch. With my coded scripts, which I run on all systems, I identified numerous problems with the applications and systems, a primary one being that RUNSTATS had not been run on many objects. My UDB work included establishing the documents and standards for UDB monitoring and performance problem identification. I am an expert in UDB performance tuning.
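The RUNSTATS gap called out above is the classic fix on DB2 for z/OS; a hedged example of the utility control statement (the database and tablespace names are hypothetical):

```sql
-- DB2 for z/OS utility: refresh catalog statistics so the optimizer
-- stops guessing; typically run after large loads or reorgs.
RUNSTATS TABLESPACE DWDB.TSORDERS
    TABLE(ALL)
    INDEX(ALL)
    SHRLEVEL REFERENCE
```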
JUL 2002 to JUN 2003
TERADATA CORPORATION, US POSTAL SERVICE
ROCKVILLE, MD
MODELING AND WAREHOUSE DBA & SENIOR ARCHITECT:
I was the Teradata modeler/DBA/architect for the US Government Services Division of NCR, with a sensitive security clearance for this very large scope project at the USPS. Creation of business objects including very large databases, tables, and indexes. Lead modeler/architect for the FLASH system and the overall CIS systems at the Postal Service. This project was a DB2/Teradata warehouse effort requiring skills in midrange and mainframe applications; data was ported from the DB2 database into Teradata. I was involved in the architecture and design phases, from design, metadata layouts, and modeling through data extraction (DB2/IDMS/IMS/Oracle) on multi-platform environments. As a senior member of the team I took a lead role in the information gathering and modeling effort. Extensive use of the ERWIN modeling tool and NCR tools.
JAN 2002 to JULY 2002
BANK OF AMERICA
CONCORD, CA
DEVELOPER AND MODELER
I was a primary Teradata developer and modeler for the FORWARD project at Bank of America. This project was a DB2/Teradata warehouse effort requiring skills in midrange and mainframe applications; data was ported from the DB2 database into Teradata. I was involved in the architecture and design phases, from design, metadata layouts, and modeling through data extraction from DB2 to production installation on Teradata, with some NOMAD. I wrote extensive and sophisticated SQL scripts, exceeding 3,000 lines of BTEQ script code, to successfully move the data and reformat it for the new platform. This included data from newly acquired banks in the north and on the east coast. Creation of business objects including very large databases, tables, and indexes. The concept was to provide a uniform data repository for all of the bank's information regarding customers and balances, with the total number of rows well in excess of 200 million. True data mining and warehouse concepts were employed, allowing for vast data mining applications. Made use of BTEQ, TDC, and QueryManager, Visio, and metadata tools. Made changes in MVS/DB2 for additions to DB2 tables; this required COBOL programming, compiles, links, testing, and a walkthrough for production installation. Worked with Martin Brown and Susan Perez on this installation. Wrote queries to analyze sales channels and understand buying behavior.
MAY 2001 to DEC 2001
NORDSTROM
SEATTLE, WA
DB2 PERFORMANCE CONSULTANT
My primary role was in design and development of the DB2 warehouse at Nordstrom, including migration of Oracle and DB2 data to the physical warehouse tables in Teradata. Installation of DB2 databases on NCR platforms. I did performance tuning of DB2 databases, backups, and recoveries. Responsible for model development, logical-to-physical design of objects, and creation of database structures. My work included extensive use of SQL and performance work; I utilized indexes to enhance query selects for rapid response time. The warehouse contained over 2 terabytes of data. Very strong MVS skills, with extensive work in JCL and JES2.
June 2000 to May 2001, D2K Incorporated, Teradata and DB2 Database:
Installation of Teradata databases on NCR platforms; performance tuning of Teradata databases, backups, and recoveries. I was in charge of the migration effort to Teradata DBS V2R2 and the installation of two 5100S and four 5100M NCR processors. Implemented parallel servers for 24/7 access, greater availability, and scalability, and implemented partitioning for large database structures. Extensive use of Oracle Enterprise Manager Tuning Pack & Change Management Pack. As director for the IT department and database administration (Oracle was the primary database, across all platforms spanning California, Singapore, and Japan), I was responsible for all IT duties at D2K. This included the maintenance and support of 50+ servers and 100+ workstations. Server platforms included 27 Sun servers ranging from Ultra 1s to UE-450s, 8 HP9000 servers ranging from A to K series machines, 2 DEC Alphas running VMS and DEC UNIX, 4 IBM RS/6000s, 15 NT servers, and an IBM AS/400 and P390. My duties included backup and recovery, security, firewall maintenance, purchasing, database maintenance (Oracle, Informix, Sybase, Red Brick, DB2 on all platforms, and IMS), disaster recovery planning, capacity planning, and resource allocation. In the preceding 12 months, I grew the department 200% and improved system availability 500%. Responsibilities also included maintenance of all internal networks, VPN connections, RAS, and telephony; planning for future growth; and expansion of our SAN (NetApp 740). Working from a multi-year, multi-million dollar budget, I was tasked with a 500% growth chart over the next 3 years. Duties also included all facilities management.
Mar-Jun 2000 INVESTOOL, Inc. TERADATA ARCHITECT & ADMIN, Database Administration, and Network Engineering:
Extensive use of UDB DB2 advanced security, database encryption, single sign-on services, and extended security features to provide a comprehensive security network. Installed backup and recovery scripts for Investools on the production web servers, using remote servers for backup. I worked as the primary UNIX administrator, actively involved in the purchase and installation of all equipment, including the acquisition and installation of a T3/OC12 for network communications. Installed a failsafe security system and generators to ensure 100% uptime. Hired engineers to assist with growth of the company. Worked on conversion of 20 servers and 150 workstations from Sun Solaris 2.6 to 2.7. Network administration and performance tuning. Responsibilities included the Raptor firewall, NIS administration, and Sendmail services. Support for Sybase application servers. Installation of vendor products.
Jan-Mar 2000 TIBCO Software, Inc. Oracle DBA/Unix SA and Network Engineer:
Installed backup and recovery scripts for TIBCO on the production web servers, using remote servers for backup. Completed both hot and cold backup scripts with export/import logic and tape storage. Designed a sliced-disk architecture for physical archive of data, building the slices to allow 8 GB of storage allocated to each of the servers. This allowed for daily, weekly, and monthly backup of data.
Nov-Jan 2000 yIPes (A STARTUP, PRE-IPO), SENIOR UNIX ADMINISTRATOR, SYSTEMS
NETWORK ENGINEER & RELATIONAL DATABASE ADMINISTRATION:
Interviewed candidates for Unix admin positions with this new startup, a pre-IPO, ISP broadband, fiber optic communications company. Installed 4 new Sun boxes from the factory and fully configured the systems to conform to the needs of the users. Two were configured as Oracle database servers. One other box and an NT box were configured as application servers: the NT application server housed Remedy and other support applications; the other housed Portal, to be used as an application development box for billing, with that work being completed by E&Y for yIPes. One box had 1.5 GB of main memory, the others 1 GB. All boxes had two 18 GB Sun drives and 400 MHz processors. The two 450 boxes included 4 processors each; the 250s each had 2. One box had a 3D graphics card, the rest a standard graphics package. I had only one Sun monitor and keyboard to install with, and had to do most of the installations on a character-mode terminal shared with the NT box. The last 250 box to arrive was set up to be used as the yIPes web server, which we transferred to a secure site in Palo Alto after I completed installation, configuration, and tuning of the system. With only two monitors available, the setup of the systems was difficult but not impossible. Added to the difficulty was the requirement that I place each box into production and keep it up while doing the configuration on the new systems as they arrived. Basically, I achieved this by booting the systems on the character-mode terminal, then starting a session on the Sun monitor and telnetting to the system using its IP (i.e., telnet 10.0.0.50). As long as I kept this remote link alive, I could disconnect the character monitor and hook it to the new systems as they came in. I resolved many problems relative to the UNIX systems both behind the firewall and outside it. Internally the node was, say, 10.0.0.1 for our router, and externally 220.127.116.11 was the default router.
My resolution mapped the 18.104.22.168 address to the internal IP of 10.0.0.50, done through DNS and the use of NetScreen. I resolved one particularly persistent problem by determining that the subnet was not mapped correctly between points of entry; I corrected it with the command ifconfig hme0 10.0.0.50 netmask 0xffff0000 broadcast 10.0.255.255 up, and then set the routing with route add default 10.0.0.1 1. On two of the Sun Solaris systems I downloaded the ssh software from sunfreeware.com, along with many other packages such as Perl. ssh (secure shell) allows access to a Sun system over an encrypted channel only. I set up the DNS to allow only ssh users to enter behind the "wall", to ensure security.
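The interface and routing fix above can be sketched as a short script. This is a dry run, not a definitive procedure: hme0, the addresses, and the /16 netmask are the values from this engagement, and the wrapper only prints each command instead of executing it.

```shell
#!/bin/sh
# Dry-run sketch of the Solaris interface/routing fix described above.
# hme0 and all addresses are the values from the text, not a template.
run() { echo "+ $*"; }   # swap the echo for "$@" to actually apply the commands

# Plumb the interface with the /16 netmask that matched the subnet plan
run ifconfig hme0 10.0.0.50 netmask 0xffff0000 broadcast 10.0.255.255 up
# Point the default route at the internal router
run route add default 10.0.0.1 1
```

Printing the commands first made it easy to review the netmask and broadcast values before touching a production box.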
Jan-Nov 1999 PRUDENTIAL SECURITIES - MANHATTAN, UNIX AIX ADMINISTRATION/PROJECT LEAD & DB2 DATABASE ADMINISTRATION:
DHCP, NIS, LDAP, and Perl web development; programming and systems work including installation of the current UNIX operating system. Resolved problems on a DB2/2 database with very large table structures containing the seven-year securities history for every Prudential transaction. AIX scripts, shells, and programming, with front-end GUI development in TUXEDO. SAP interfaces and utilities. Backups and restores using import/export. DB2/2 object manipulation, performance tuning, reorgs, index evaluation, and data evaluation. Creation of business objects including very large databases, tables, and indexes. Close interaction with the customer base and Prudential's technical staff. Established standards and process flows to move data from mainframe DB2 and export it to the servers. Application made available to Prudential agents 24x7. Configured bash, ksh, and csh files and directories. Use of encoded and compressed files, plus all utilities.
1998-1999 DB2 IBM GLOBAL SYSTEMS-SOLARIS 2.6&7 ORACLE 7&8i:
Worked in Houston with a Fortune 100 client as an IBM support systems professional. Internet/HTML/Java scripts and shell programming. Networks, CICS, NetView; databases: Sybase, Oracle, NOMAD, DB2, IMS systems programming. TCP/IP, client/server. DB2 systems and various vendor products including RP/Server Nomad, ViaSoft, Omegamon, and Sybase Replication Agent. Installation of products with SMP/E and support for Y2K efforts. Excellent client references.
1998-1999 OS/390 Y2K TERADATA ARCHITECT & ADMIN/STORAGE MANAGER & DATABASE ADMINISTRATION, Visa International:
DB2/Teradata development and administration. CICS interface via Transaction Server and front-end GUI development. Storage management: DFSMS, DFSMShsm, DFSMSdfp, DFSMSoam, DFSMSdss, DFSMSnfs, DFSMSrmm, StorageTek, EMC, TMS; product manager for a time-travel (date-simulation) Y2K testing system. HCD, ISMF, CICS systems programmer. Comfortable with all products on any platform. CICS systems programming in a multi-region sysplex.
1996-1998 MVS/DFSMS/CICS/DB2 SYSTEMS PROGRAMMER & STORAGE MANAGEMENT, Hitachi,
State of Georgia Senior systems programmer:
Storage management: DFSMS, DFSMShsm, DFSMSoam. Installed started tasks, shot dumps, altered TSO logon procs, resolved user complaints, and altered SYS1.PROCLIB/SYS1.PARMLIB. Responsible for product installation: DB2 R310, DB2 R410, BMC, Omegamon (twice), Platinum, DB2PM, Boole & Babbage, STROBE, etc. Investigated problems using IIN and pulled and installed maintenance: CBIPO/CBPDO/electronic PTFs/APAR fixes. Programmed in Assembler (ASMA90), REXX, and CLIST. Personally responsible for managing systems containing the state legislature, state revenue (income and sales taxes), Department of Motor Vehicles, and welfare data.
1996 PROJECT MANAGER & DATABASE ADMINISTRATION, TERADATA MIGRATION, Coca-Cola / NCR:
Responsible for managing the migration effort to Teradata DBS V2R2 and the installation of two 5100S and four 5100M NCR processors, on a budget of $8M. The total effort involved approximately 50 personnel from both Coca-Cola and NCR. Responsibilities included network connectivity between all 5100s and two Hitachi Skylines, including LAN/WAN using Ethernet and TCP/IP. The project was completed ahead of schedule and within budget.
1996 TERADATA PERFORMANCE EXPERT, Trans-Union:
Analyzed and significantly improved processes, cutting costs and generating millions of dollars of revenue for Trans-Union. Site size: 9M rows spread over 12 AMPs. Also led a conversion from Teradata to MVS/DB2 R410. Extensive work with SQL and scripts to achieve very fast response times. Fixed numerous problems with physical structures. Extensive work with catalogs to tune the system. Creation of business objects including very large databases, tables, and indexes.
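The SQL tuning work described above centered on reviewing optimizer plans. A minimal sketch of that workflow, generating a BTEQ script that EXPLAINs a query so the plan can be checked, might look like the following; the logon string and the prds.customer_txn table are hypothetical placeholders, not the client's actual objects.

```shell
#!/bin/sh
# Sketch of a BTEQ-driven EXPLAIN check for Teradata query tuning.
# tdpid, the credentials, and the table/column names are hypothetical.
cat <<'EOF' > explain_check.btq
.LOGON tdpid/dbc,password
EXPLAIN
SELECT cust_id, SUM(txn_amt)
FROM prds.customer_txn
GROUP BY cust_id;
.LOGOFF
EOF
echo "wrote explain_check.btq"
# bteq < explain_check.btq > explain_check.out   # run against the warehouse
```

Capturing the plan to a file makes it easy to diff plans before and after an index or statistics change.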
1996-1997 IBM GLOBAL SYSTEMS, OLYMPIC GAMES, Atlanta, Georgia
UNIX Admin and tuning senior consultant for DB2:
I was responsible for the design, development, and implementation of the client/server environment for the games in Atlanta. This included modeling and referential integrity, both DB2-enforced and application-enforced, using both ERwin and Bachman as modeling tools. The design allowed the tailoring of systems and performance so that data could be ported between DB2/2 (OS/2 V1.3), DB2 on AS/400, DB2/6000 on RISC System/6000, and mainframe DB2. The primary focus was on the DB2/2 implementation, design, and performance tuning. Assisted with the implementation for the winter Olympics in Nagano, Japan.
1994-1995 STORAGE SYSPROG, DB2 Sysprog and DB2 ARCHITECT, Atl. Gas Light Co.
Storage management DFSMS/DFHSM:
Designed performance enhancements to production. Maintained a DB2/CICS production application sized at 289M rows. Established the company's first volume test environment. Supported 23 testing and volume-performance environments totaling 1200M rows. Created a REXX tools package for sysprogs.
1992-1994 DB2 Design Consultant, IBM Atlanta Lab:
Consultant to the lab as a design, performance, and tuning specialist for DB2. Evaluated and resolved performance problems on IBM's object-oriented engineering package. I was the primary representative of the DB2 systems group at the lab in change-management meetings. I wrote and maintained programs in REXX and C++ for both DB2 R230 and R310. Extensive systems programming work. Installed and tuned DB2 R310, providing performance design enhancements for the system and applications. Established a volume test environment for the lab. Extensive design, logical to physical, static and dynamic SQL. Extensive work with remote DB2 systems and the Distributed Data Facility (DDF). Utilities, migrations, and PTFs in a very large environment. Used Bachman and IEW for logical/physical design. Extensive work on OS/2 DB2/2 and DB2/6000 on the RISC System/6000. Client/server installation, backups, design, and tuning. Ported code and data from DB2/MVS to the RISC System/6000.
1991-1992 DB2 Level 2 Consultant, IBM Santa Teresa Labs:
Consultant to the lab in California working with DB2 internal code. My primary focus was the DB2 attach mechanisms, including CICS, IMS, and TSO. I worked closely with the IBM DB2 DBAs and admins on enhancements to internal DB2 code. I managed 10 consultants in development efforts, PTF design and implementation, and customer support. Worked directly with IBM's largest customers (Department of Defense, Citibank, AT&T, etc.), nationally and internationally, to install the new Beta R310 release from the lab.
CLIENT LIST, LAST 23 YEARS
Bank of America, CITI Group (Mexico)
Monterey Savings & Loan, Coca-Cola/Teradata
Standard Oil of California, Ford Aerospace
Sprint, Stanford University
Pacific Bell, Westinghouse
Apple Computer, Pacific Gas & Electric
AT&T, City of San Francisco
Atari, FMC
American President Lines
PhD, Psychology, Walden University (candidate), GPA 3.8
PHI CHI, National Graduate School Honor Society
MS, Computer Science, San Francisco State University GPA 3.9
BS, Computer Science/Psychology, University of Washington GPA 3.75
Juris Doctor (law doctorate), Howard Taft University, 3rd yr., GPA 4.0
Dale Carnegie, courses and awards.
Best in Class Presentation Award.
US MARINE CORPS, 26TH EXPEDITIONARY FORCE,
1st Marine Division, Vietnam, COMBAT DECORATIONS.