Software architect with a proven track record of designing, implementing, and deploying data transport and storage systems that operate at the largest internet scale.
Senior Staff Software Engineer @ LinkedIn
Architect with LinkedIn's Distributed Data Systems team, developing ESPRESSO, a data store that bridges the gap between simple key-value stores and full-fledged RDBMSs.
ESPRESSO is an elastic, horizontally-scalable, document-oriented, distributed data store with limited secondary indexing. Unique features of ESPRESSO include:
* support for transactional updates to closely related data items
* generation of an after-image changelog for downstream processing
* local secondary index support
From January 2011 to Present (5 years)

Software Architect, Sr. Principal @ Yahoo!
Senior Principal Architect for Yahoo! Mail.
Architect for a team redesigning how message metadata is stored and retrieved within Yahoo! Mail. The goal of the project was to move away from storing metadata in flat files on expensive storage appliances to a more cloud-like architecture built on commodity hardware shared with Mail Search. Unlike the current design, which hits quantization limits imposed by the size and I/O capacity of a single storage appliance, the new design is horizontally scalable to data-center-wide deployments and eliminates the duplication of metadata for over 1 billion mailboxes across separate search and mailbox subsystems. My contributions include a design for lock-less reads that permits concurrent access to users' mailboxes from multiple clients (e.g. web and mobile) while allowing simultaneous delivery of new mail. The design eliminates the primary source of mailbox-unavailable errors for our end users.
From January 2010 to January 2011 (1 year 1 month)

Architect @ Yahoo!
I was solely responsible for the design of Yahoo!'s Updates platform, and provided technical leadership for the development team that built it. Yahoo! Updates syndicates digests of user-generated content and activity. The system aggregates data on behalf of over 80 million users and displays recent activity to related users on Front Page, Mail, Messenger, Pulse, and numerous other Yahoo! properties. The system serves an average of just under 8,000 API requests per second worldwide, with an average latency of less than 100 msec.
As architect for the Updates platform, I worked closely with our Cloud storage team to define features of Yahoo!'s next-generation distributed storage infrastructure. As an early adopter, we moved the entire Updates data store. Twice. Both data migrations were performed on live systems without data loss or discontinuity of service. I was responsible for the migration strategy.
From January 2007 to December 2009 (3 years)

Sr. Software Engineer @ Yahoo!
As a member of Yahoo!'s "Tiger Team" I spent 9 months on the Low Cost Content Match project (similar to Google AdSense). I was responsible for a component that denormalized advertiser bid data and stored it in an in-memory data store for fast access during content match. I also worked on changes to the data store to facilitate internationalization (i18n) of the terms and creatives.
From March 2006 to January 2007 (11 months)

Technical Yahoo! @ Yahoo!
I designed and built the back-end data store for Yahoo!'s foray into social networking, Yahoo! 360. I was also responsible for most of the integrations with other Yahoo! properties. When Yahoo! announced end-of-life for Yahoo! 360 in the fall of 2008, the system was serving 1.2 billion page views to 12 million unique users each month.
From December 2003 to March 2006 (2 years 4 months)

Technical Yahoo! @ Yahoo!
I was responsible for proxyio, the dominant data transport infrastructure in use at Yahoo!. Proxyio is a dynamically reconfigurable, fault-tolerant, distributed, connection-caching messaging system. Every login event, email message, instant message, and advertisement delivered on the Yahoo! network relies on the system.
From January 1999 to December 2003 (5 years)

Member, Technical Staff @ SGI
Member of a small team responsible for SGI's Ada compiler and tools. I was primarily responsible for the run-time components that implemented the interface between the GNU Ada compiler and the Irix operating system.
I was the primary maintainer of an Irix pthread library used by the tasking run-time system, and also served as build master and the primary interface with several large customers.
From 1994 to 1999 (5 years)

Systems Engineer
Engineer working on a real-time embedded component of the AN/USQ-101(V) Tactical Data Information Exchange Subsystem Broadcast (TADIXS B) Tactical Receive Equipment (TRE). Held a Secret clearance.
From 1992 to 1994 (2 years)

Runtime Guy @ TeleSoft
Responsible for TeleSoft's Ada tasking run-time system. Member of the AJPO User-Implementer team prototyping proposed language changes for the Ada 9X ANSI/ISO language revision. Author of several papers published in ACM Ada Letters and conference proceedings.
From 1983 to 1992 (9 years)
B.S. Computer Science, Minor in Electrical Engineering @ San Diego State University-California State University
From 1980 to 1985

Tom Quiggle is skilled in: Distributed Systems, Scalability, Perl, Software Engineering, REST, C++, TCP/IP, Linux, C, Architectures, Integration, Multithreading, Scrum, Architecture, Ada programming, Operating Systems