Search Results - jian meng

3 Results


  1. The success of conventional supervised learning relies on large-scale labeled datasets to achieve high accuracy. However, annotating millions of data samples is labor-intensive and time-consuming, which motivates self-supervised learning as an attractive alternative in which artificial labels are used in place of human-annotated ones for training. Contrastive... (a minimal contrastive-loss sketch follows this results list)
    Published: 2/13/2025
  2. Deep neural networks (DNNs) have shown extraordinary performance in recent years across applications including image classification, object detection, and speech recognition. Accuracy-driven DNN architectures tend to grow model sizes and computation at a very fast pace, demanding a massive amount of hardware resources. Frequent communication...
    Published: 2/13/2025
  3. Background: Deep neural networks (DNNs) have been very successful in large-scale recognition tasks, but they exhibit large computation and memory requirements. To address the memory bottleneck of digital DNN hardware accelerators, in-memory computing (IMC) designs have been proposed to perform analog DNN computations inside the memory. Recent IMC... (a conceptual crossbar-computation sketch follows this results list)
    Published: 2/13/2025
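
A note on result 1: the snippet cuts off at "Contrastive...", but contrastive self-supervised learning generally builds its "artificial labels" from the data itself: two augmented views of the same sample form a positive pair, and the other samples in the batch serve as negatives. Below is a minimal NumPy sketch of an InfoNCE-style contrastive loss; the function name, temperature value, and exact formulation are illustrative assumptions, not the method claimed in the listed invention.

import numpy as np

def info_nce_loss(z_a, z_b, temperature=0.5):
    # z_a, z_b: (batch, dim) L2-normalized embeddings of two augmented
    # views of the same inputs; row i of z_a and row i of z_b form a
    # positive pair, and all other rows act as negatives.
    logits = (z_a @ z_b.T) / temperature            # scaled cosine similarities
    logits -= logits.max(axis=1, keepdims=True)     # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # The "artificial label" of row i is i itself: each sample must
    # pick out its own augmented counterpart from the batch.
    idx = np.arange(z_a.shape[0])
    return -log_probs[idx, idx].mean()

# Tiny usage example with random, normalized embeddings.
rng = np.random.default_rng(0)
z_a = rng.normal(size=(8, 32))
z_a /= np.linalg.norm(z_a, axis=1, keepdims=True)
z_b = z_a + 0.05 * rng.normal(size=z_a.shape)   # a slightly perturbed "second view"
z_b /= np.linalg.norm(z_b, axis=1, keepdims=True)
print(info_nce_loss(z_a, z_b))

Because the target for row i is simply its own index, no human annotation is ever consulted; the supervision signal comes entirely from data augmentation.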
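A note on result 3: an IMC crossbar evaluates a DNN layer's matrix-vector product directly inside the memory array. Weights are stored as cell conductances, inputs are applied as voltages, the analog column currents accumulate the products, and an ADC digitizes the result. The simulation below is a conceptual sketch under assumed read-noise and uniform-ADC models; it does not reflect the specific IMC design in the listed invention.

import numpy as np

def imc_matvec(weights, x, adc_bits=6, noise_std=0.01, rng=None):
    # One analog crossbar matrix-vector multiply: the ideal product,
    # plus additive read noise, followed by a uniform ADC.
    rng = rng if rng is not None else np.random.default_rng()
    currents = weights @ x                                   # ideal bitline currents
    currents = currents + noise_std * rng.normal(size=currents.shape)
    lo, hi = currents.min(), currents.max()
    levels = 2 ** adc_bits - 1
    step = (hi - lo) / levels if hi > lo else 1.0
    return lo + step * np.round((currents - lo) / step)      # quantized outputs

rng = np.random.default_rng(0)
W = rng.normal(size=(16, 64))   # stored weight (conductance) matrix
x = rng.normal(size=64)         # input activations applied as voltages
err = np.abs(imc_matvec(W, x, rng=rng) - W @ x).max()
print(err)  # worst-case deviation from the exact product

The printed deviation comes from ADC quantization plus analog read noise, which are exactly the non-idealities that IMC accelerator designs must keep small enough to preserve DNN accuracy.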
