Search Results - jae-sun+seo

13 Results

  1. The success of conventional supervised learning relies on large-scale labeled datasets to achieve high accuracy. However, annotating millions of data samples is labor-intensive and time-consuming. This makes self-supervised learning an attractive alternative, in which artificial labels are used instead of human-annotated ones for training. Contrastive...
    Published: 2/13/2025
  2. Deep neural networks (DNNs) have shown extraordinary performance in recent years for various applications, including image classification, object detection, speech recognition, etc. Accuracy-driven DNN architectures tend to increase model sizes and computations at a very fast pace, demanding a massive amount of hardware resources. Frequent communication...
    Published: 2/13/2025
  3. Background Recurrent neural networks (RNNs) with long short-term memory (LSTM) units enable accurate automatic speech recognition (ASR) but are large in size. Due to the large size of these networks, most speech recognition tasks are performed on cloud servers, which requires a constant internet connection, introduces privacy concerns,...
    Published: 2/13/2025
  4. Recently, deep neural networks (DNNs) have been deployed in many safety-critical applications. The security of DNN models can be compromised by adversarial input examples, where the adversary maliciously crafts and adds input noise to fool a DNN model. The perturbation of model parameters (e.g., weights) is another security concern, one that relates...
    Published: 2/13/2025
  5. Background Deep neural networks, and in particular convolutional neural networks, are being used with increasing frequency for a number of tasks such as image classification, image clustering, and object recognition. In a forward propagation of a conventional convolutional neural network, a kernel is passed over one or more tensors to produce one or...
    Published: 2/13/2025
  6. Background Deep neural networks (DNNs) have been very successful in large-scale recognition tasks, but they exhibit large computation and memory requirements. To address the memory bottleneck of digital DNN hardware accelerators, in-memory computing (IMC) designs have been presented to perform analog DNN computations inside the memory. Recent IMC...
    Published: 2/13/2025
  7. Background In the era of artificial intelligence, various deep neural networks (DNNs), such as multi-layer perceptrons, convolutional neural networks, and recurrent neural networks, have emerged and achieved human-level performance in many recognition tasks. These DNNs usually require billions of multiply-and-accumulate (MAC) operations, soliciting...
    Published: 2/13/2025
  8. Background Traditional hardware designs for device authentication and secret key generation typically employ physical unclonable functions (PUFs) which generate unique random numbers based on static random-access memory (SRAM), delay, or analog circuit elements. Although silicon PUFs can be highly stable and unique, they do not represent liveliness....
    Published: 2/13/2025
  9. Background Neuromorphic computing, or the concept of designing systems that mimic the adaptability and learning exhibited by biological neural processes, is the future of computing design. This type of computing is ideal for use in the Internet of Things (IoT) movement, which refers to the embedding of internet computing devices into everyday objects....
    Published: 2/13/2025
    Inventor(s): Jae-Sun Seo, Shimeng Yu
  10. Recent breakthroughs in deep neural networks (DNNs) have led to improvements in state-of-the-art speech applications. Conventional DNNs have hundreds or thousands of neurons in each layer, which require a large amount of memory to store the connections between neurons. Implementing these networks in hardware requires a large memory and high computation...
    Published: 2/13/2025

Search Inventions

Arizona State University has more than 300 technologies available for licensing.