Energy Efficient Neural Architectures for TinyML Applications

Authors

M. Faheem
DOI:

https://doi.org/10.38124/ijsrmt.v4i5.531

Keywords:

TinyML, Energy-Efficient Neural Networks, Edge Computing, Model Compression Techniques, Neural Architecture Optimization

Abstract

Tiny Machine Learning (TinyML), which brings machine learning to microcontrollers and edge sensors, is reshaping how models are deployed. This article investigates energy-efficient neural network designs for TinyML that balance accuracy, memory footprint, and power consumption. We review recent developments in model quantization, pruning, and neural architecture search (NAS) that enable deep learning models to run on highly energy-constrained devices. The practical deployment of MobileNet, SqueezeNet, and EfficientNet on edge hardware is examined, along with how well these architectures preserve accuracy. We also evaluate approaches to reducing DRAM energy consumption through hardware-software co-design and the use of specialized accelerators. Because real-time decisions are critical in environmental monitoring, wearable technology, and industrial IoT, model deployment must be both efficient and dependable. The article surveys the most recent findings to show how energy-efficient architectures support the rapid progress of TinyML across these domains and, by focusing on practical methods and real use cases, offers actionable guidance for designing smart, energy-efficient edge systems.
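The two compression techniques the abstract highlights, pruning and quantization, can be illustrated with a minimal NumPy sketch. This is an illustrative example only, not code from the paper: `magnitude_prune` and `quantize_int8` are hypothetical helper names, and the sketch shows unstructured magnitude pruning followed by symmetric 8-bit post-training quantization on a single weight matrix.

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.5):
    """Unstructured pruning: zero out the smallest-magnitude weights."""
    threshold = np.quantile(np.abs(weights), sparsity)
    return np.where(np.abs(weights) < threshold, 0.0, weights)

def quantize_int8(weights):
    """Symmetric post-training quantization: float32 -> int8 plus a scale."""
    scale = float(np.abs(weights).max()) / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights for accuracy evaluation."""
    return q.astype(np.float32) * scale

# Toy weight matrix standing in for one layer of a TinyML model.
rng = np.random.default_rng(0)
w = rng.standard_normal((4, 4)).astype(np.float32)

w_pruned = magnitude_prune(w, sparsity=0.5)   # ~50% of entries become zero
q, scale = quantize_int8(w_pruned)            # int8 storage: 4x smaller than float32
w_restored = dequantize(q, scale)             # used to check reconstruction error
```

In a real deployment the pruned/quantized weights would be exported to a runtime such as TensorFlow Lite for Microcontrollers; here the point is only that int8 storage cuts memory (and hence DRAM access energy) roughly fourfold while the rounding error per weight is bounded by half the quantization step.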

Published

2025-05-29

How to Cite

Faheem, M. (2025). Energy Efficient Neural Architectures for TinyML Applications. International Journal of Scientific Research and Modern Technology, 4(5), 45–50. https://doi.org/10.38124/ijsrmt.v4i5.531
