Information/AI Architect, Modeler, & Data Scientist @ DBmind Technologies
MDM/DWH Information Architect/BTA @ Novartis
Big Data Scientist & Information Architect @ DBmind - Other Consulting & Research
BSC, Computer Science & minor in Accounting @
University of Regina
Robert is an Information/Rules Architect who is “hands on” with Service/Data Modeling while applying the Industry Standards. Robert is a specialist in the architecture, design, & development of Cloud based SOA Web/Restful Services for MDM/Data Warehouses with Business Intelligence and Business Rules/BPM integrations. Unstructured, Semi-structured, & Structured Data integrations with taxonomies, ontologies, vocabularies and Canonical/Logical/Physical models are his speciality. The Data Sciences have been his focus since graduate studies, where Robert has applied techniques from Machine Learning, Artificial/Computational Intelligence, Big Data, and Data Mining. Robert is a pioneer in the application of Computational Intelligence for Financial Engineering with emphasis on Advanced Algo Trading strategies and Risk Management. Robert's research abilities, coupled with his work experience, give him an outstanding ability to evaluate and apply new technologies and products. Robert has over thirty years of experience in designing, developing, and maintaining information technology systems while applying the needed information governance mechanisms. Project management, team leadership, and mentoring have been an integral part of Robert's project work. He has an extensive background with operating systems, communications, databases, and the internet. Robert’s domain knowledge cuts across the Financial, Pharmaceutical, Healthcare, Insurance, Energy, High Tech, and Agriculture industries.
Data Sciences - Information Architecture – Modeling(BPMN, Taxonomy, Ontology, Vocabulary, Canonical, Logical, Physical) – SOA – Web/API/Restful Services – Industry Standards – Data Mining - Computational Intelligence for Financial Engineering – Trading Systems – Risk Management – MDM/Data Warehouses/EDW - Business Intelligence – Business Rules/BPM – the internet/Semantic Web – Cloud Computing – Big Data – Machine Learning
Information/Rules Architect, Modeler, & Data Scientist @ Robert's Consulting Assignments:
CCAR Information Architect & BPMN Modeler
Enterprise Data Architect & Modeler
Other Industry Consulting & Research (Jul'14-Dec'14)
Data Scientist/Information Architect and Modeler
Data Scientist/Information Architect and Modeler and Rules Developer
Credit Suisse (May'11-Dec'11)
Client On-Boarding Service Modeler and SOA Architect
Credit Suisse (Jun'10-May'11)
Equity Shared Service/Trading Technologies - Information Architect and Modeler
Credit Suisse (May'09-May'10)
Trading Horizontal - Prime Services Product Line Rep and Information Architect
XBRL & LegalXML (Jan'09-Apr'09)
Information/Rules XML Architect
Credit Suisse (Oct'08-Dec'08)
Capital Markets Information Architect/Modeler
The Bank of Tokyo-Mitsubishi UFJ (Mar'07-Apr'07)
Mortgage Securities Information Architect/Modeler
Operational Risk(Basel II) Information Architect/Modeler
Bessemer Trust (Jun'02-Feb'03)
CRM ETL Information Architect/Developer
Data Synapse & SOA/Digital Evolution (Jan'02-Dec'03)
SOA/Grid Computing Architect/Developer
JPMorgan – (Dec'99-Dec'01)
Global Credit Risk(Basel II) Data Warehouse Architect/Developer
Deutsche Bank Securities – (May'97-Nov'99)
Fixed Income Middle Office Information Architect/Developer
Data Mining/Warehouse Architect/Modeler
Data Warehouse Architect/Modeler
Technical Project Manager and Data Architect From January 1995 to Present (20 years 10 months) Enterprise Data Architect & Modeler @ Evaluated two MDM (Master Data Management) vendors, IBM InfoSphere & Informatica Siperian (MDM), while making a recommendation. Mapped the Canonical Data Dictionary to the MDM “to-be” mastered & survivorship rules. Created a BPMN model with No Magic's Cameo Business Modeler for the Data Governance of their Canonical Model aligning with their Enterprise Information Architecture. Investigated the ACORD Capability Model taxonomy at the process/activity level for BPMN adoption into Sparx Enterprise Architect for a Risk Management component of RSA Archer. Examined the Operational Risk Activity/Controls (PRINCE2/SOX/COSO) for possible SIFI (Systemically Important Financial Institutions) alignment. Created the initial Service/Data Governance Modeling Standards & Guidelines with emphasis on JSON, Agile MDA/Model Driven Design (MDD), & Restful/API Services for hosting in Systinet/Swagger. Participated in the Service/Data Governance meetings while enforcing the standards and guidelines established. Reviewed the taxonomy metadata for tagging content/unstructured data for IBM's WCM/Watson/Google Search Engine and Cognizant's DAM/AssetSERV & GAP (Google Analytics Prime). From January 2015 to June 2015 (6 months) Data Scientist/MDM Architect/Modeler @ MDM Architect/Modeler:
• Reviewed the current state architecture for a Global Wealth Management Prospect/Client Profile System while looking for gaps in their Information Architecture. The main focus was on their Party/Product data with Account linkages. Lack of Governance was a major inhibitor for their systems operations. Data Quality and performance issues needed to be addressed; the annual client mail-out, for example, was too onerous. FINRA/FDIC/Dodd-Frank regulatory requirements were examined. The system was Sybase/Java based. Numerous future state options were examined, one of which was to use Informatica's ETL/MDM platform. PowerDesigner was the modeling tool used.
• For a Health Care Provider - created the MDM Strategy, Reference, and Roadmap documents integrating (Agile) Talend MDM, Collibra, BizTalk, Tableau, SSIS-MS SQL Server/Pick DB, Salesforce. Investigated the ICD 9/10 & HIPAA/HITECH/PHI regulatory requirements. Integrated the BizTalk HL7/RIM/XML canonical mappings while researching data layouts for Medicare, Medicaid, FIDA, CMS, UAS-NY/Assessment, DOH, LexisNexis with focus on member, payer/provider, claims, & clinical data. Integrations were with QNXT, Medecision, McKesson, Lawson. Other standards examined: OpenEHR, CDISC, SOA, W3C, OMG.
• Investigated a Decision Support System optimizing appliance claim/repair and scheduling for a Home Appliance Insurance/Warranty company. Root-cause analysis combined with cost efficiency were incorporated. An in-house Rete Algorithm was compared to a JBoss Rules approach. Future recommendations were proposed.
• Researched and reviewed numerous papers applying Topic Modeling to Big Data and Data Visualization and a paper integrating Fuzzy Logic with XBRL.
• Investigated a CART Decision Tree system for a Mortgage and Consumer Loan Servicer to analyze and predict loan prepayment & default.
• Researched a rules-based CBR (Case-Based Reasoning) solution to a financial problem.
• Investigated IBM Watson's applicability to a financial problem. From July 2014 to December 2014 (6 months) MMD Data Scientist/Architect & Rules Developer @ This project with Merck's Manufacturing Division was the "first of its kind" where millions of documents and data sets can now be auto-tagged to enable Google-like searches that could save millions. Ontologies/Taxonomies were created and integrated with a set of business rules for classifying structured, semi-structured, and unstructured data while optimizing tagged search capability. Text mining with NLP (Natural Language Processing) algorithms was used to enhance the classifying rules.
•Set up the Principles, Standards & Guidelines for Ontology & Auto-classification Business Rules creation and optimization through reporting metrics.
•Established the governance and change management processes for Taxonomy and Rules development.
•Integrated the Smartlogic Semaphore tool suite to establish the first production release of MMD's Technical Knowledge (TK) Taxonomy for API (Active Pharmaceutical Ingredient) and DP (Drug Product) for auto classification of their structured, semi-structured & unstructured data.
•Domain Expert interviews and exercises were combined with text mining to drive out Term Relationships for Hierarchy, Association, & Equivalence while investigating Industry Taxonomy integrations.
•Sharepoint integrations were established for auto classification, navigation, and search optimization with Metavis, Kapow, FAST, Documentum, BA Insight.
•Other NLP (Natural Language Processing) tools like I2E were investigated for rule text mining capability and taxonomy development. Topic Modeling was researched for applicability.
•Designed the Smartlogic CRT auto classification metrics report developed in Groovy. Smartlogic is based on Python. Examined Smartlogic/Marklogic NoSQL integrations.
•Business/Functional Specs were vetted aligning with the test scripts used for production deployment.
•Aided in the integration of a Structured (DDL)/Unstructured Smartlogic model interface with Data Virtualization (Composite) and Big Data considerations with Hortonworks/Hadoop. From August 2013 to June 2014 (11 months) PA, NJ MRL Information Architect & Modeler @ This project with Merck’s Research Labs created the EngageZone.Merck.com award-winning platform which supplies a safe haven for external partner collaboration and massively improves the partner on-boarding experience. We integrated a cloud based MDM system with real time communications through Web Services. An Intellectual Property Risk Model and Strategy was developed to protect sensitive information.
•Created the initial “Data at Rest” Logical/MDM Model via Erwin and the “Data in Motion” Canonical Model with XMLspy. Put into Production the MDM/Web Service integrations while coordinating code generations and testing. Blueprint, Quality Center, SoapUI, PowerDesigner, and Eclipse tools were used with Sharepoint/Intralinks/Aspera.
•We interfaced many systems with our IA Landscape which had a Big Data theme with considerations for Social Media and the Clinical Trial Exadata. Advanced analytic tools were investigated through tools like Microsoft Dynamics.
•Worked with the initial data landing and publish model with emphasis towards a Data Warehouse/ODS.
•Defined the vocabulary/ontology/data dictionary/BPMN with the needed governance/tooling mechanisms. Investigated many tools such as Semaphore, Anzo, Protégé, and Adaptive.
•Created the Master/Solution plans integrating the Enterprise/Information Architecture while aligning with the ITIL/TOGAF standards via Sparx Enterprise Architect/PowerDesigner.
•Worked with the Business Analysts to drive out the Liaison MDM/Web Services Requirements and Functional Specifications with the SDLC/Agile methodology and compliance procedures for QA/production releases.
•Created the IP Risk Model with considerations for DRM(Rights Management), Encryption, Authentication, and Monitoring for proper sensitive content handling.
•Created the “Data in Motion” Modeling Standards and Guidelines with Industry Standard alignment (HL7(RIM)/OAGIS/ISO/CDISC/W3C/OMG) and WSDL, SOAP, XSD integrations for Web Services and Restful APIs with Apigee. From January 2012 to July 2013 (1 year 7 months) NJ, PA Client On-Boarding Service Modeler and SOA Architect @ The Client On-Boarding project incorporated an SOA framework for services covering the Derivatives-based ISDA/CSA contracts for KYC, Credit, Contract/Agreement Management, Collateral, and Account Setup with reference data integration. Common services and content/data models were merged across the Equities, Derivatives, Fixed Income and Prime Services COB platforms for Cash and OTC products. A target architecture/blueprint was defined while researching the use of ArchiMate. This was the first Web Service put into Production integrated with their cloud based architecture.
•COB integrations with their Business Process Models(BPMN/Aris) and UML Component Diagrams were cross referenced while aligning the services to the Conceptual Model’s high level entities/complex types.
•Drove out the COB SOA Business Requirements and Functional Specifications
•Service Identification with Business Rules and content model adoptions with Oxygen, Sparx Systems Architect, and conceptual models with Erwin. UML Primary Use Case models were vetted.
•UML Class Diagram representations for our WSDL, SOAP, and XSD were integrated with XMI.
•Service Testing tools were investigated – Junit, SoapUI, Parasoft, and iTKO Lisa.
•Eclipse/Helios with CXF based SOAP over HTTP/JMS code generation for Model Driven Architecture.
•All Services, WSDL(s), and XSD(s) were registered with WS02 Service Registry for SOA Governance.
•Architecture discussions were hosted for the proposed changes needed for FATCA/Dodd-Frank Act/MiFID. From May 2011 to December 2011 (8 months) Equity Shared Service/Trading Technologies - Information Architect and Modeler @ Incorporated a SOA framework for ultra low latency multi-asset-class trading. Common services and content/data models were derived for the Equities, Fixed Income and Prime Services electronic/algo trading platforms for Futures, Options, Foreign Exchange, Stocks, Bonds and Swaps. Visual Studio was used to design UML Entity/Class diagrams and UML Activity/Sequence/BPMN Diagrams with XMI, M1, and M2 alignment. Authored their modeling and code generation XML Standards and Guidelines with linkages to W3C, OMG, OSGi, Oasis, FIX, FpML, EMF eCore, JETS, T4, Jira, SVN, Eclipse, TeamCity, ISO 20022, Swift, Binary Wire Format.
•Target Architecture/Blueprints were specified with Business Requirements and Functional Specifications.
•Modeled their FIXatdl based Algo Trading strategies for FX, Equities, Futures, & Options.
•Modeled their Reference/Market Data, Quote Execution, Market Making, Order Execution for Multi-Assets. Bloomberg, Reuters, Market Axess, MTS, TradeWeb, ICAP, BondPoint were integrated along with Pricing Data. Visual Studio, XMLspy, Progress’s DXSI, Erwin, Eclipse etc. were used to model.
•Responsible for the Equities, Fixed Income, & Derivatives XML/UML modeling initiatives incorporating the previous guidelines set on the Cash Securities Data Store project with Data Warehouse/MDM integration.
•Modeled the SOA messaging for the first Fixed Income Algo Trading Platform - Onyx - handling Pairs.
•Low latency messaging via Informatica 29West with UML/XSD models aligned with the FIX/FpML industry standards. Solace was investigated along with FIX’s Binary Wire format. XPath was used for mapping.
•Exchange Links, ECN, SOR, OMS, EMS, and Algo Engines all leveraged our Ultra Low Latency Message Bus.
•AutoQuoter and AutoHedging and Order Matching were examined, Teradata’s Trading relational model was reviewed. Xerces and XSLT and UML 2 Activity Diagrams were applied.
•Participated in the Hadoop Tuesday calls with Informatica, Cloudera, and Gartner et al. From June 2010 to May 2011 (1 year) Trading Horizontal - Prime Services Product Line Rep & Information Architect @ The Prime Services Trading Horizontal Project incorporated the analysis for a SOA framework for low latency multi-asset-class trading. Common services and content models were fused for the Equities, Fixed Income and Prime Services electronic/algo trading platforms for Futures, Options, Foreign Exchange, Stocks, Bonds and Swaps. This included Prime’s Listed Derivatives business with 6000+ external clients trading 40+ Derivatives markets globally.
•SOA Business Requirements Documents and Functional Specifications were derived with SOA Architecture considerations.
•Trading Horizontal Prime Brokerage integrations were made for Sec. Lending, Risk and Margining.
•An Event Stream Processing (ESP) Symposium was reviewed with talks from Gartner, Sybase, Progress, etc.
•A Global High Performance Message Routing system was analyzed with PrimeTrade and the Exchanges.
•Modeled with UML New Order, Amend and Cancel Order flows with Algo Strategy Considerations.
o-Strategies for Plain Vanilla Futures/Options and Multi Leg Orders, Packs and Bundles.
o-Butterfly, Condor, Calendar Spreads, Straddle, Strangle, Vertical and Horizontal Spreads
•Coordinated Prime Services Applications analysis for Service Identification with managed interfaces.
o-Reviewed TOGAF and Zachman with the analysis and mapped the Business Processes(BPM)
•Chaired the Trading Horizontal Service Modeling working group while proposing a Service Modeling Strategy for XSD, XMI, UML, WSDL, Soap, and WS-* (Reliable Messaging, Security, and Addressing)
•Examined the various order execution payloads with Agora, AES, Fidessa, and the Exchange Links. From May 2009 to May 2010 (1 year 1 month) Information/Rules XML Architect @ Researched a .Net/C# XBRL Rules engine environment for XBRL reporting. Participated in a Legal Business Rules initiative with Haley’s Office RulesBurst & LegalXML specific to Export/Import Laws. From January 2009 to April 2009 (4 months) Information/Enterprise Architect @ Global XML Guidance documents were developed to ensure a Web Services (SOA) framework was done consistently while evaluating a canonical model/repository. FI, Equities, Derivatives, and PB high/low latency transport considerations were needed while enabling numerous payload layouts across their OMS, trading services, Validation/Enrichment(VnE), settlement systems, pricing/positions, and P&L. Data Warehouse integration points were initiated specific to their Cash Securities Data Store. Distributed data caching with Coherence/Gemfire were evaluated. The W3C, FIX, FpML, MISMO, XBRL, ACORD, MDDL, AMQP, OMG, ISO, SWIFT standards were examined for possible adoption into the common message models
•A Front to Back Office Messaging Strategy Scope Document was formulated for Agora, Fidessa, Interwoven front/middle office system interconnects with a Data Warehouse and XML.
•Progress/IONA and Tibco/EMS were used for "pub/sub" "request/response" messages with SOA Java/C# XML integrations. From October 2008 to December 2008 (3 months) Capital Markets Information Architect/Modeler @ Created an Information Architecture Blueprint/Reference Model for the SOA/WS-* integration. Authored the XML Standards and Guidelines with W3C/FIX/FpML/ACORD. An XML Capital Markets Vocabulary/XSD was created for "pub/sub"/"request/response" message patterns with TIBCO EMS/MQ/Java/XML with JAXB Object Binding. XMLSPY/Erwin was used to model/create the message/database systems. All the transactional/reference metadata was persisted to Oracle 10g with Ab Initio/Toad. Designed the Trade, Security, Position Data Warehouse. FI, Equities, PB, Mortgage, & Loans were modeled to the trade lifecycle with fees, settlements, commitments, securitization, allocations, P&L. Performed ILOG XOM/BOM BRMS mapping with Protégé ontology interconnects. Modeled their Trade Compression, Execution Cost, and Account eSales initiatives with transactional/reference data interconnects for customer tracking of their high net worth clients. Built their WebLogic-based XSD Repository. A Coherence and GemFire POC was done. From May 2007 to September 2008 (1 year 5 months) KYC Architect/Developer @ Designed, modeled, and developed their initial KYC database and UI interface for their individual, legal entity, and correspondent questionnaires for auditing purposes. SharePoint Services was researched. This was designed for a SQL Server interface while using Access 2000 and Excel-based interfaces. From March 2007 to April 2007 (2 months) Mortgage Securities Information Architect/Modeler @ Created the Information Architecture Blueprint/Reference Model for the SOA/WS-* integration. Authored the XML Standards and Guidelines with W3C/MBA/MISMO/FpML. An XML Mortgage Securities Vocabulary/XSD was created for "pub/sub"/"request/response" message patterns with TIBCO EMS/MQ & Java/C# with Hibernate.
XMLSPY/Erwin was used to model the message/database systems. All transactional/reference metadata was persisted into Oracle 10g/DB2 8.1. Modeled the processes for the Enterprise Mortgage Integration (EMI) Hub & MISMO eMortgage from Loan Origination to Securitization. Implemented the XSD mapping to SAP, CVMS, MERS, LP, DU, Core Logic, Dorado, Lydian, & NetOx. Implemented the TradeWeb SOAP 1.1 mapping while researching the FpML/FIX interconnects. Integrated ETL/OLAP with Tedi, Business Objects and Business Rules with ILOG. Meta Data Business Rules were researched with Active BPEL (1.1 to 2.0 process execution). BPM/BRMS engines were investigated with products from ILOG, Haley, and IBM (MQ Workflow), and the Websphere 5.1.1 WASD tool suite. Rational UML was used for the sequence diagrams and integrated BPM modeling. Managed TCS/India on/off-shoring initiatives. Established linkages to the Corporate MDR Reference Data. Built their XSD Repository with TIBCO Canon. From January 2005 to February 2007 (2 years 2 months) Operational Risk(Basel II) Information Architect/Modeler @ Created their Information Architecture Blueprint and Reference Model for the Basel II Operational Risk award-winning software program called Phoenix. Modeled their Back Office operations with Erwin/XMLSPY for Losses/Gains, Self Assessments/SOX, Capital, and KPIs. Calculated the Capital with Fitch’s OpVantage using a Monte Carlo algorithm. Designed the Litigation Matters interface and processing to the Data Warehouse. Researched an Operational Health Indicator (OHI) with a Fuzzy/Rough Set application. Designed/implemented the Oracle based Data Warehouse Information Architecture with ETL/OLAP tools Informatica RT and Business Objects. Helped setup the standards for the Data Warehouse Center of Excellence. Managed and audited many of the programming tasks which were on/off-shored to WIPRO in India.
From March 2003 to December 2004 (1 year 10 months) SOA/Grid Computing Architect @ •Implemented a Web Services Management POC with SOAP interceptors via Apache for SOA Inc.
•Audited the design of a Data Warehouse which implemented OLAP with MicroStrategy and the Oracle App Server for a dot com startup called Vital Health. Converted this from JBoss, Enhydra, and MySQL.
•Aided the pre-sales efforts while integrating Grid/HPC (high performance computing) software for various clients for DataSynapse. An Open Source solution with StarFish was researched. From January 2002 to December 2003 (2 years) CRM ETL Information Architect/Developer @ Erwin was the metadata tool used to model and create the database systems with constraint handling. All the Erwin-based metadata was persisted into Oracle 9i with Erwin’s DDL creator. All low level mappings were housed in Excel spreadsheets while Siebel’s Data Dictionary was reverse engineered into Erwin.
•Designed and implemented the population and construction of the CRM Siebel-based staging area and Data Warehouse while using the OLAP tool Cognos with a PeopleSoft interconnect.
•Researched and designed many aspects of Wealth Management and Customer Relationship Management (CRM) into the architecture.
•Modeled and coded many of the ETL procedures with Erwin, Perl, XSLT, Oracle, and .NET. From June 2002 to February 2003 (9 months) Global Credit Risk(Basel II) Data Warehouse Architect/Developer @ Revised the ETL Architecture Blueprint and Reference Model. Popkin’s System Architect, Erwin, and Rational UML metadata tools were used. Implemented the Basel II Credit Risk Back Office operations for Authorization and Facility Management for 009/reg, methodologies, ratings, derivatives, currencies. Calculated credit exposures for Single Name, Short Term Book, Region, Industry, and Country Risk. Designed credit exposures to break out into Issuer, Equity, Counterparty, Contingent, Backstop, Settlement, and Clearing with a Trade Book interconnect. This included Expected, Peak, Gross, Notional, and Negative exposures. Helped with the EVA engine via Robert Merton's equity/grids he designed onsite. VAR calculators were key components of our Risk Engine. Designed the Sybase DWH with the Sagent/Perl ETL tool while providing the OLAP with MicroStrategy/Cognos/Business Objects. Prepared a proposal for Credit Risk Notch data mining with Rough Sets and Microsoft’s Analysis Services. From December 1999 to December 2001 (2 years 1 month) Fixed Income Middle Office Information Architect/Developer @ All market/transactional/reference metadata was created with Oracle Designer/Erwin while integrating the ICI/ADP, SWIFT, Rolfe & Nolan, Bloomberg protocols. Aided in the creation of a Data Architecture Blueprint and architected a Web Based Middle Office DWH/ODS for the Fixed Income Front to Back Office Trading and Settlement Systems. Trades routed from the front-end system to the service bureau (ICI/ADP) for books/records using MQ. Batch processes reconciled the positions, fails, GL. Coded in C++ the Bloomberg front-end interface for trades, securities, and pricing. Participated on the startup of FpML and kept up with the FIX/FPL initiatives.
Designed a SWIFT interconnect while being cognizant of the current SWIFT XML initiatives. Supported and developed Oracle PL/SQL triggers, procedures, and packages. Setup a Web Based Knowledge and Change Management repository. Tested the Y2K/DR site for the Capital Markets, Fixed Income, and Equity Trade systems. From May 1997 to November 1999 (2 years 7 months) Web Server Architect @ •Prepared an E-commerce proposal for developing their Web Server in 3 phases which included the setup of the intranet/extranet with a defined methodology.
•Created a high level BPR analysis document of their back office operations.
•Proposed using Java, JDBC, SSI, Novell, AS400, Pershing, NSCC, NT, IIS, Apache, MySQL, and Perl. From August 1999 to October 1999 (3 months) Data Mining/Warehouse Architect/Modeler @ PRISM and Erwin were used as the metadata tools to build a star-schema model. Data Visualization metadata was defined and stored in SGI’s MineSet tool suite.
•Modeled a dimensional customer data warehouse in Erwin while integrating the OLAP tool Cognos Impromptu and Pyramid PRISM with the techniques from Inmon, Kimball, and Zachman.
•Modeled and constructed a data visualization data mart in Erwin while using SGI's MineSet tools (Scatterviz/Treeviz) to visualize Chase's CD product information for campaign management.
•Created the project plan and identified the business processes required for the production environment. From December 1996 to April 1997 (5 months) Data Mining/Warehouse Architect @ Prepared a VLDB data warehouse/mining proposal integrating the ETI-Extract Tool Suite product. Robert’s Knowledge Discovery Methodology described in his Master’s Thesis from the University of Regina was used in the proposal for the data mining of customer profiles. From October 1996 to December 1996 (3 months) Data Warehouse Architect/Modeler @ Kimball’s techniques were used to model the second phase of the dimensional customer data warehouse in Erwin. Robert optimized the design of the Data Warehouse and communications in an Oracle, Unix (HP), TCP/IP, and Platinum Business Objects environment. An OLTP system (1400 tables) was reverse engineered from Data Workstation into Erwin for the data warehouse. Extract Tool Suite was implemented and Rational Rose researched. From June 1996 to November 1996 (6 months) Technical/Data Architect @ Modeled an AFE (Authorization for Expenditure) system in Erwin/S-Designer using E&Y's AFE & PPDM tables. Architected the system using an Excel 5 Windows (Visual Basic) linkage to an Oracle 7 Unix Server. From January 1996 to May 1996 (5 months) Calgary, Canada Area Technical Project Manager and Data Architect @ Modeled with Erwin/Deft (Mac) the business processes/ERD for a Corporate Data Repository and Data Warehouse. Project Leader/Systems Analyst for the development of a Client/Server Data Warehouse in Sybase/4D/Forte with TCP/IP-DECNET on Dec Alphas. Managed the department's meetings and monitored the administrative processes. From January 1995 to May 1996 (1 year 5 months) Calgary, Canada Area Research & Systems Analyst, DBA, & Developer @ Infrastructure and Technical Support Group
Project Lead for RDBMS evaluation (Sybase/SQL Server vs Oracle). Completed Oracle’s DBA and Tuning course. Evaluated Gupta SQLWindows & Oracle Forms products for Oracle 7 using the Navigator methodology. Maintained three Oracle 7 OS/2 servers with numerous Windows clients interconnected by SQL*NET. Maintained and developed many C/Cobol/assembler systems as a systems analyst with DEC/IBM (MVS, CICS, RPG, Syncsort). Managed and taught courses on RSX, SQL, and Software Metrics (Function Points). An Amdahl mainframe was leased.
Developed jointly with the UofR the first prototype expert system using Level 5 and C/C++. Used Micro Focus Cobol. From September 1986 to December 1994 (8 years 4 months) Regina, Saskatchewan, Canada Systems Analyst & Developer @ At the “Alberta Research Council (ARC)” during “The Hail” project – when I worked as a summer student – the specially equipped planes for “Cloud Seeding” were piloted by Intera. I lucked out again on another project with the Intera team where other planes were equipped with sonar/radar equipment similar to the sonar equipment mapping a seabed. While the plane flew at a constant altitude and speed it was able to capture the landscape it was flying over on sonar/radar tape. The High Density Digital Tape (HDDT) was then converted to Computer Compatible Tape (CCT) via a PDP computer with our in-house developed software. This data was then data visualized and analyzed. This was hard core "Data Science/Big Data" in its earliest form. This was an incredible achievement at the time before full access to satellite data was available. The Coast Guard Ice Cutters in the Arctic Ocean could be advised on where the thinnest ice was to clear shipping lanes or to find the easiest way through the ice to free a stuck ship. On the island of Java the jungle is so thick it is hard to tell where the best place is to build a bridge or road. With this technology the “road engineers” could be aided in these situations. On the project I performed the system manager duties for UNIX/RSX while configuring the hardware and OS Gens. Developed in assembler/C the variable send and receive directives for the RSX 11M OS which was used to optimize the tape translation speed. Developed in assembler a system to transfer radar data from HDDT tape to a CCT tape via a DMA device.
I was also involved in the development of four programs in C: (1) a cloud analysis program; (2) a hail analysis program; (3) a VOR/DME program for aviation; (4) a program to interface a plane's sensor information with dBase III. I even helped wire wrap a board that was needed. I will never forget my trips up to Canada's Arctic - Inuvik & Tuktoyaktuk. There is a beauty and peacefulness up there which is hard to describe. From February 1986 to August 1986 (7 months) Calgary, Canada Area Programmer Analyst & Developer @ During our orientation we met Don Lougheed (yes - Peter's brother) and I still remember him saying Esso's name is a trick on the initials SO for Standard Oil. He also mentioned that Alberta has more oil than the Saudis but that our oil was of a lower grade and harder to extract and process. At Esso I had the privilege to simulate their heavy crude oil extraction processes with steam injection. I was aided by similar "Simulation and Modeling" course assignments I did at the UofR with the Xerox Sigma 9 machine with the GPDS language. Thank you Dr. Law! At Esso we used IBM's GPSS which generated PL/1 code. FYI: GPSS also influenced SIMSCRIPT and SIMULA too. With the steam injection simulation we were able to forecast the recovery volumes while optimizing the reservoir extraction processes. I also developed two PC based databases with RBASE. The simulation project was also run with MVS, CICS, JCL, RPG, PL/1, and COBOL. I also enjoyed organizing the IT Christmas Party and hanging out with my orientation buddies: Bernie, Gary, & Brad. From September 1985 to January 1986 (5 months) Calgary, Canada Area Systems Programmer @ I'm proud to have worked with the legendary "Saskatchewan Wheat Pool" which historically helped revolutionize the agriculture business and Saskatchewan’s great heritage. This was a farmer's cooperative - when in its heyday – almost every town had a grain elevator. Farmers hauled their grain to these elevators while buying farm supplies.
The elevator was part of the town as much as the post office or hardware store. The elevator's grain would be unloaded to a grain train car, where it was transported and sold locally/globally. Every elevator had a computer which acted like a "Cash Register". This rugged PDP 11/24 computer had a VT100 screen/keypad with an LA30 printer, running the RSX-11M OS with two removable 10 MB RL02 disk drives and 4 MB of RAM. We built most of the software ourselves in COBOL, Assembler, & C. All purchases were recorded by our Elevator Managers via the VT100, and paper copy receipts were given to our farmers. This information was stored during the day in an in-house developed B-tree database on over 500 remote PDPs. In the evening – well past midnight – two Regina-based PDP 11/70 computers extracted all the remote elevator/livestock database records. We also had numerous PDP 11/44's handling the auctions at our livestock sites. The 11/70's data was then sent to our IBM mainframe for processing. In Tech Support, us DEC techies also maintained RSX-11M by performing OS/Network gens. This included applying many custom OS extensions needed to support our in-house built applications. I also developed an email system so the Elevator Managers could email each other. Worked closely with Network Support and the Operators while being on call during the evenings. I enjoyed teaching RSX and how we applied it. Debugged and installed an open source Virtual Terminal app discovered through DECUS. Also set up the source/build control system. Wow – it was so exciting working at the Pool after graduation while being part of this incredible Digital Revolution! From May 1984 to September 1985 (1 year 5 months) Regina, Saskatchewan, Canada

Student Consultant for the Computer Science Department @ Helped students with their programs written in C, Fortran, Basic, APL, Pascal, Assembler, Lisp, Prolog, ADA, etc. and SQL with Ingres and DB2.
I also ensured the systems (VAX/VMS, UNIX, CP-5) were operational during those times. Also used Easytrieve/SAS at SGI for their reporting/analysis needs while simulating and modelling their inventory system with SIMSCRIPT. From September 1983 to April 1984 (8 months) Regina, Saskatchewan, Canada

Summer Student for the Alberta Hail Project - Atmospheric Sciences @ Hail – sometimes bigger than golf balls – devastated farmers' crops and ruined Albertans' property while sending insurance claims sky high! We developed a real-time big data visual interface of the cloud formations via S & C band radar systems, which determined which clouds had hail. We then sent up specialized aircraft to shoot silver iodide ("Cloud Seeding") into the culprit clouds to reduce the chance that it would hail. For two summers I totally lucked out on being part of this magnificent research project, which still fascinates me today. During my summers I had many duties that entailed being a VAX/VMS operator and helping develop a real-time radar data acquisition system and graphics package in Fortran and C. Before we decided to use C we took the language for a test drive by developing a text readability analyzer with the spirit of Natural Language Processing (NLP) in mind. The program was a great success, processing many scientific and technical documents for their readability scores. We all learnt a lot about content readability and C while having lots of fun! I also designed a disk cache program for the transfer of radar data from tape. The analysis also applied a Fast Fourier Transform (FFT) approach in Fortran 77. I also participated in some research related to lightning. I would also like to thank my mentors – Mark Johnson & Bill Korendyk – who inspired me to delve into the technical world of Computer Science. I still reminisce about our "techie" talks over a pitcher of beer at the Lion Heads Pub while trying to develop my "nerves of steel" playing Ms. Pac-Man.
From May 1982 to August 1983 (1 year 4 months) Edmonton, Canada Area

Student Consultant/Technical Writer for the Alberta Heritage Trust Fund @ The Heritage Trust Fund was valued at over $16 billion in 2013, funded with the oil revenues Alberta received in the past. Being born and raised in Northern Alberta, I have always felt saving this investment for our future generations is paramount. I had the pleasure of documenting the investment procedures used with the computers for the Heritage Trust Fund while studying their investment methods and the Canadian Securities Course. I also hope that more oil revenues are contributed to this fund. From July 1981 to November 1981 (5 months) Edmonton, Canada Area

IEEE CIS CIFEr, Qwafafew, FIXatdl, XBRL, MISMO, RuleML, MIT-CTO @ IEEE/IAFE CIS CIFEr (Computational Intelligence for Financial Engineers) (’95 to Current)
Presented a paper based on my thesis – Stock Market Analysis utilizing Rough Sets – at the first IAFE/IEEE conference in 1995 in NYC while organizing the Poster Session. Was the CIFEr Organizational Chair in 1996 and Technical Chair for Computational Finance from '98 to '01 & '09/'10. Was the Program Chair for CIFEr in Nashville ('09), Paris ('11), NYC ('12), and Singapore ('13), and Honorary Chair in London ('14). http://www.ieee-ssci.org http://www.ieee-cifer.org
XBRL (’07 to Current)
Participated in the startup of XBRL US with the Domain Working Group meetings on the US GAAP/USBRF Modeling (Patterns) doc with IFRS considerations. I brought to the team my expertise on XSD hierarchical taxonomy modeling and extensions while proposing a common business rule dialect with RuleML. Set up an XBRL panel at CIFEr '09 in Nashville. Researched an XBRL US consistency checker with RuleML.
FIXatdl (Algorithmic Trading Description Language) (’07 to '11)
Set the standards for the FIXatdl using Xmlspy/Jira with XML/Java/C#. Modeled for low latency/cost execution along the wire. Designed the Algo Trader User Interface, Trade Flow Control, and Rules.
MISMO/MBA (Mortgage Industry Standards Maintenance Organization) (’05 to ’07)
Participated in the startup of the Business Rules Working Group and the Envelope Working Group, where we drove out the related standards for WS-* with Addressing, Reliable Messaging, and Security. Aided in a Canonical Model tool evaluation.
RuleML (’07 to '11)
Participated in 5 international conferences with RuleML while serving on the Program Committee. Was the Track Chair on Cross Industry Standard Business Rules (XBRL, MISMO, FIXatdl, FpML, HL7, ACORD), which included GRC and Corporate Actions.
MIT-CIO 09 Symposium in Boston. Panel Captain for GRC http://www.mitcio.com
Princeton/NYC Qwafafew Panelist/Speaker/Participant
SAS M2009 Data Mining Conference Invited Speaker From January 1980 to February 1980 (2 months)
MSC, Computer Science - AI/CI & Machine Learning & Data Mining/Rough Sets @ University of Regina From 1990 to 1995 BSC, Computer Science & minor in Accounting @ University of Regina From 1981 to 1984 Computer Science with specialization in Business Admin @ University of Alberta From 1979 to 1981 Robert Golan is skilled in: Artificial Intelligence, Computational..., Knowledge Discovery, SOA, Data Warehousing, Business Intelligence, Data Modeling, Data Mining, XML, Architecture, Web Services, Master Data Management, ETL, UML, SQL, Machine Learning, Data Science, Semantic Web, Data Visualization, Computer Science, Ontologies, XSD, WSDL, Business Rules, Trading Systems, Pharmaceutical Industry, Financial Engineering, Computational Finance, Energy Industry, Enterprise Architecture, Information Architecture, Econometrics, Programming, Inmon, Kimball, OLAP, OLTP, Logical Data Modeling, Relational Data Modeling, XML Schema Design, UML Tools, Data Stewardship, Star Schema, Electronic Trading, Fixed Income, Derivatives, Capital Markets, Data Migration, Perl, Architectures