Harmony in Complexity: Unveiling Mathematical Unity Across Logistic Regression, Artificial Neural Networks, and Computer Vision

Abstract: 

This presentation embarks on an exploration of the intricate interconnections that bind logistic regression, neural networks, and computer vision, unveiling their shared foundational principles through the lens of linear algebra. The focus is on how abstract mathematical concepts shape and unify these seemingly different methodologies.

By drawing parallels between the construction of a logistic regression model and its mathematical representation, we create a path to understanding the intrinsic relationship between the two. In logistic regression, the weighted combination of input features forms a linear function whose decision boundary is a plane (more generally, a hyperplane) in feature space. This picture sets the stage for the processes at work in neural networks.
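As a minimal sketch of this picture (the weights, bias, and data below are illustrative placeholders, not values from the presentation), logistic regression is just a linear combination of features passed through a sigmoid:

```python
import numpy as np

def sigmoid(z):
    """Logistic (sigmoid) activation: maps any real number into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def logistic_regression_predict(X, w, b):
    """
    Predicted probabilities for a logistic regression model.

    X : (n_samples, n_features) feature matrix
    w : (n_features,) weight vector
    b : scalar bias

    The linear part X @ w + b defines a hyperplane in feature space;
    the sigmoid squashes that score into a probability.
    """
    z = X @ w + b          # weighted combination of features (a plane/hyperplane)
    return sigmoid(z)      # probability of the positive class

# Toy usage with placeholder weights (normally these are learned from data)
X = np.array([[0.5, 1.2],
              [2.0, -0.3]])
w = np.array([1.5, -0.8])
b = 0.1
print(logistic_regression_predict(X, w, b))
```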

In the realm of neural networks, the combination of weights and nodes again takes center stage: each node computes a weighted combination of its inputs, so every layer carves the input space with a collection of multi-dimensional planes. The alignment of these planes with linear algebra principles becomes apparent, highlighting the basic mathematics that shapes how neural networks work. Despite their outward dissimilarity, an underlying mathematical structure binds these models together, with the principal differentiator residing in the activation function: logistic regression relies on the sigmoid, while neural networks typically use ReLU in their hidden layers, showcasing the adaptability of these mathematical tools.
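A short sketch makes the point concrete: the two-layer network below (with assumed layer sizes and randomly initialized weights, purely for illustration) is built from the same matrix products as logistic regression, with ReLU inserted in the hidden layer and a sigmoid at the output.

```python
import numpy as np

def relu(z):
    """ReLU activation: keeps positive values, zeroes out the rest."""
    return np.maximum(0.0, z)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def tiny_network(X, W1, b1, w2, b2):
    """
    A two-layer network built from the same pieces as logistic regression:
    each layer is a matrix product (linear algebra) followed by an activation.
    Remove the hidden layer and the model collapses back to logistic regression;
    the ReLU hidden layer is the only new ingredient.
    """
    h = relu(X @ W1 + b1)        # hidden layer: weighted sums, then ReLU
    return sigmoid(h @ w2 + b2)  # output layer: weighted sum, then sigmoid

# Illustrative shapes: 2 input features, 3 hidden units, random weights
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 2))
W1 = rng.normal(size=(2, 3)); b1 = np.zeros(3)
w2 = rng.normal(size=(3,));   b2 = 0.0
print(tiny_network(X, W1, b1, w2, b2))
```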

The widespread use of the ReLU activation function in both fully connected neural networks and convolutional neural networks (CNNs), the architecture most widely employed in computer vision, reveals a shared mathematical foundation. This consistency across architectures underscores the universality of the principles derived from linear algebra. Transitioning into the realm of computer vision, we explore the application of filters as weighted combinations of pixel values: a convolution extends the same linear algebraic concept to image processing, demonstrating the versatility and applicability of these mathematical principles across diverse domains.
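To illustrate, the sketch below writes a convolutional filter explicitly as a weighted sum over pixel neighborhoods (the cross-correlation form that CNN layers compute); the 3x3 horizontal-gradient kernel and the toy image are illustrative, not taken from the presentation.

```python
import numpy as np

def convolve2d_valid(image, kernel):
    """
    'Valid' 2-D filtering written explicitly as weighted sums: every output
    pixel is a dot product between an image patch and the filter weights,
    the same linear-algebra operation used in the models above.
    """
    ih, iw = image.shape
    kh, kw = kernel.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = image[i:i + kh, j:j + kw]
            out[i, j] = np.sum(patch * kernel)  # weighted combination of pixels
    return out

# Illustrative 3x3 horizontal-gradient filter on a small synthetic image
image = np.arange(25, dtype=float).reshape(5, 5)
kernel = np.array([[-1.0, 0.0, 1.0],
                   [-1.0, 0.0, 1.0],
                   [-1.0, 0.0, 1.0]])
print(convolve2d_valid(image, kernel))
```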

In essence, this presentation seeks to illuminate the profound harmony and shared essence of mathematical principles that transcend traditional disciplinary boundaries. It underscores the unifying influence of linear algebra in unraveling the core relationships defining the evolution of machine learning and computer vision paradigms, providing a holistic perspective for researchers.

Bio: 

Dr. Liliang Chen is a seasoned financial analytics manager with over 13 years of experience in the financial industry. He has made significant contributions to renowned Fortune 50 companies, including AIG, Fannie Mae, and Freddie Mac. Driven by a passion for exploring the intersection of AI and machine learning with finance, he actively seeks opportunities to innovate and disrupt the industry. Dr. Chen possesses extensive expertise in leveraging advanced technologies such as deep learning, neural networks, and natural language processing (NLP) to address real business challenges. He earned his Ph.D. in Engineering from the Catholic University of America.
