Deep Learning for Signals
Deep learning is increasingly being incorporated into applications involving signal and time-series data, such as voice assistants, digital health, and radar and wireless communications.
In this video, you will learn how to use techniques such as time-frequency transformations and wavelet scattering networks in conjunction with convolutional and recurrent neural networks to build predictive models on signals.
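To make the idea concrete, here is a minimal MATLAB sketch of wavelet scattering features computed for a synthetic signal. It assumes the Wavelet Toolbox and Signal Processing Toolbox are installed; the chirp signal, sample rate, and default scattering parameters are illustrative assumptions and are not taken from the video.

% Wavelet scattering features for a synthetic signal (illustrative only)
fs = 1e3;                                          % sample rate, Hz
t  = (0:1/fs:2-1/fs)';                             % 2-second time vector (column)
x  = chirp(t, 20, 2, 150) + 0.05*randn(size(t));   % synthetic noisy chirp

% Scattering network sized for this signal length and sample rate
sf = waveletScattering('SignalLength', numel(x), 'SamplingFrequency', fs);

% One row per scattering path, one column per time window
S = featureMatrix(sf, x);

The matrix S can then be used as features for a classifier, or passed window by window to a recurrent network such as an LSTM.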
You will also see how MATLAB® can help you with the four steps typically involved in building such applications:
- Accessing and managing signal data from a variety of hardware devices
- Performing deep learning on signals through time-frequency representations or deep networks (see the sketch after this list)
- Training deep networks on single or multiple NVIDIA® GPUs on local machines or cloud-based systems
- Generating optimized CUDA® code for your signal preprocessing algorithms and deep networks
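As a minimal sketch of steps 2 and 3 above, the code below turns a signal into a time-frequency image with a spectrogram and defines a small convolutional network plus training options that can target a GPU when one is available. It assumes the Signal Processing Toolbox and Deep Learning Toolbox; the synthetic chirp, layer sizes, class count, and training settings are illustrative assumptions, not taken from the video.

% Time-frequency representation of a synthetic signal (illustrative only)
fs = 1e3;                                          % sample rate, Hz
t  = 0:1/fs:1-1/fs;
x  = chirp(t, 10, 1, 200) + 0.1*randn(size(t));    % synthetic noisy chirp

s = spectrogram(x, 128, 120, 128, fs);             % short-time Fourier transform
P = abs(s);                                        % magnitude "image", frequency x time

% A small CNN that accepts one spectrogram image per observation
layers = [
    imageInputLayer([size(P) 1])
    convolution2dLayer(3, 8, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer(2, 'Stride', 2)
    fullyConnectedLayer(2)                         % e.g., two signal classes
    softmaxLayer
    classificationLayer];

% Training options; 'ExecutionEnvironment' lets training run on a GPU if present
opts = trainingOptions('adam', 'ExecutionEnvironment', 'auto', 'MaxEpochs', 10);
% net = trainNetwork(trainingData, layers, opts);  % trainingData is a hypothetical labeled datastore

Given a labeled datastore of spectrogram images, trainNetwork would train this network; setting 'ExecutionEnvironment' to 'multi-gpu' (with Parallel Computing Toolbox) lets training use multiple NVIDIA GPUs, which corresponds to step 3 above.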
Get a free trial: https://bit.ly/34asIUl
Deep learning for signal overview: https://bit.ly/34dKg1J
Signal labeling documentation: https://bit.ly/2PuhDt9
MATLAB and Simulink for signal processing: https://bit.ly/36d1XAm