Goal: To empower people and organizations to gain actionable insights from large datasets (petabytes in size) by building big data analytics infrastructure. The ideal workplace, one that can leverage my full potential, is where people recognize the added value of big data technologies and machine learning in creating new opportunities.

Specialties: distributed systems, high performance computing, in-memory data grids, machine learning (deep learning), big data technologies (HDFS, Hadoop MapReduce/Spark, Pig, Oozie and HBase)

Senior Data Engineer @
From June 2015 to Present (4 months)

Senior Software Developer - Big Data @
- Developed technologies using Hadoop to disseminate, analyze and store data (a.k.a. ETL) from the ICGC (International Cancer Genome Consortium) Data Coordination Centre (http://dcc.icgc.org/)
- Architected and implemented a data storage service (micro-service) for genomics research based on Ceph, Amazon S3 and Spring Boot
* Technology Used: Pig, Hadoop MapReduce & HDFS, Oozie, Amazon EC2
From January 2013 to June 2015 (2 years 6 months)

Senior C++/Java Developer - Big Data @
Prototyped new data backup and data protection software using Hadoop infrastructure.
- Designed an efficient serialization format in the Hadoop common IO package for online backup cloud storage
- Designed and implemented MapReduce jobs using Hadoop for online backup cloud storage
* Technology Used: Hadoop HDFS & MapReduce, HBase, Protocol Buffers
From June 2012 to January 2013 (8 months) Oakville, Ontario

Assistant Vice President - Senior Application Development Programmer Analyst @
- Architected and developed messaging systems using HBase and Hadoop for global credit markets, processing 2 million messages daily
- Designed schemas for column-oriented stores to improve performance for the business's specific data access patterns
- Implemented a cross-language, data-store-agnostic client API using Thrift so users can access data in HBase without needing to know the details of the underlying technology
- Tuned HBase performance and maintained its performance and stability by developing repair tools to better manage HBase operations in a real-world production environment
- Benchmarked business-specific usage scenarios by extending the Yahoo! Cloud Serving Benchmark and developing a JMX monitoring API for performance tuning
- Led the implementation of continuous integration, with the goal of delivering high-quality software, by developing testing frameworks across the entire software stack for test automation
* Technology Used: Hadoop HDFS & MapReduce, HBase, Protocol Buffers, Thrift
From April 2011 to June 2012 (1 year 3 months) Mississauga, Ontario

Software Consultant @
Developed software solutions using high performance computing and clustering technologies for customers in the telecommunications, insurance and financial services industries.
One of my original implementations: http://www.gigaspaces.com/files/excelthatscales.pdf
* Technology Used: GigaSpaces, MPI, C#, C++
From January 2008 to June 2010 (2 years 6 months) Hong Kong

Visiting Scholar & Part-Time Research Assistant @
- Performed time-of-arrival (TOA) statistical modeling for wireless communication propagation channels
- Researched novel machine learning algorithms to fine-tune computer vision systems
- Developed array signal processing algorithms based on the discrete Fourier transform to estimate direction of arrival in sensor arrays
- Investigated new methodologies for denoising ultrasound images using wavelets and filters
From October 2007 to May 2010 (2 years 8 months) Hong Kong

Research Assistant @
Researched novel eye detection algorithms using deep learning techniques (convolutional neural networks) for a real-time medical device, specifically a remote gaze estimation system.
From September 2005 to September 2007 (2 years 1 month) Toronto, Canada Area

Software Engineer @
Developed network demographics reporting software solutions to give broadband providers insights into their subscribers' online behavior.
From 2002 to August 2005 (3 years) Waterloo, Ontario, Canada

Master of Applied Science, Electrical and Computer Engineering, 4.0 @ University of Toronto, From 2005 to 2007
Bachelor of Applied Science, Computer Engineering, 3.88 @ University of Waterloo, From 1999 to 2004
Health Science Diploma, Health Science @ Vanier College, From 1997 to 1999
High School Diploma @ Ecole Secondaire LeMoyne d'Iberville, From 1993 to 1997
Master of Health Informatics, Health/Health Care Administration/Management @ University of Toronto, From 2010 to 2011

Jerry Lam is skilled in: Distributed Systems, Hadoop, Machine Learning, Java, Software Development, High Performance..., C++, Design Patterns, Concurrent Programming, Scalability, Grid Computing, Cloud Computing, MapReduce, HBase, Open Source, Parallel Computing, Healthcare, Cluster, C#, Python, Software Engineering, MPI, Neural Networks, Application Development, Linux, Computer Vision, Low Latency, Awesomeness, ElasticSearch, Apache Pig, Spark, Pattern Recognition, Image Processing, Caching, Oracle Coherence, Signal Processing, Clustering, Deep Learning
Senior Data Engineer
June 2015 to Present
Ontario Institute for Cancer Research
Senior Software Developer - Big Data
January 2013 to June 2015
Senior C++/Java Developer - Big Data
June 2012 to January 2013
Assistant Vice President - Senior Application Development Programmer Analyst
April 2011 to June 2012
Cluster Technology Ltd.
January 2008 to June 2010
Hong Kong Polytechnic University
Visiting Scholar & Part Time Research Assistant
October 2007 to May 2010
University of Toronto
September 2005 to September 2007
Toronto, Canada Area
2002 to August 2005
Waterloo, Ontario, Canada
University of Toronto
Master of Applied Science Electrical and Computer Engineering 4.0
2005 to 2007
University of Waterloo
Bachelor of Applied Science Computer Engineering 3.88
1999 to 2004
Health Science Diploma Health Science
1997 to 1999
Ecole Secondaire LeMoyne d'Iberville
High School Diploma
1993 to 1997
University of Toronto
Master of Health Informatics Health/Health Care Administration/Management
2010 to 2011
What company does Jerry Lam work for?
Jerry Lam works for Canopy Labs.
What is Jerry Lam's role at Canopy Labs?
Jerry Lam is a Senior Data Engineer at Canopy Labs.
What industry does Jerry Lam work in?
Jerry Lam works in the Research industry.