The EdgeCortix Podcast

Technology Podcasts

EdgeCortix subject matter experts discuss edge AI processors, AI software frameworks, and AI industry trends.

Location:

United States

Twitter:

@edgecortix

Language:

English

Contact:

(+81) 44-948-6620


Episodes
AI Drives the Software-Defined Heterogeneous Computing Era

6/28/2023
By Dr. Sakyasingha Dasgupta. The rapid development of artificial intelligence (AI) applications has created enormous demand for high-performance and energy-efficient computing systems. However, traditional homogeneous architectures based on Von Neumann processors face challenges in meeting the requirements of AI workloads, which often involve massive parallelism, large data volumes, and complex computations. Heterogeneous computing architectures, which integrate different processing units with specialized capabilities and features, have emerged as promising solutions for AI applications. In our view, AI is driving the next era of software-defined heterogeneous computing, enabling better solutions for complex problems. Read the full blog here

Duration:00:08:08

Multimodal Generative AI on Energy-Efficient Edge Processors

6/27/2023
By Dr. Sakyasingha Dasgupta. Edge computing will grow exponentially in the next few years, as more and more devices and applications demand low-latency, high-performance, and privacy-preserving computation at the edge of the network. However, one of the biggest challenges facing edge computing is handling the increasing complexity and diversity of data sources and modalities, such as images, videos, audio, text, speech, and sensors. This challenge is where multimodal generative artificial intelligence (AI) comes into play. Read the full blog here

Duration:00:06:30

Efficient Edge AI Chips with Reconfigurable Accelerators (Japanese-language episode)

6/26/2023
The powerful combination of DNA IP and MERA reduces the hardware-specific expertise users would otherwise need to run AI applications efficiently. Much of the AI inference conversation focuses on delivering as many operations as quickly as possible. In facilities with practically unlimited power and cooling, such as data centers, large GPU-based systems are feasible. Add the constraints of embedded systems outside the data center (broadly, "the edge"), however, and more efficient edge AI chips, scaled for size, weight, power, and utilization, become essential. The EdgeCortix Dynamic Neural Accelerator (DNA) processor architecture provides accelerated AI inference solutions for many custom ASIC and FPGA-based applications. Learn more about the technology at edgecortix.com.

Duration:00:10:30

Efficient Edge AI Chips with Reconfigurable Accelerators

6/26/2023
By Nikolay Nez. DNA IP and MERA are a potent combination, reducing the hardware-specific knowledge OEMs would otherwise need to power AI applications efficiently. Much of the AI inference conversation focuses on delivering as many operations as quickly as possible. Massive GPU-based implementations can find homes in data centers with practically unlimited power and cooling. However, add the embedded system constraints found outside the data center, and a more efficient edge AI chip becomes essential. Read the full blog here

Duration:00:07:56

Connecting Edge AI Software with PyTorch, TensorFlow Lite, and ONNX Models

6/23/2023
By Antonio Nevado. Data scientists and others may have concerns moving PyTorch, TensorFlow, and ONNX models to edge AI software applications – MERA makes it easy and is model-agnostic. PyTorch, TensorFlow, and ONNX are familiar tools for many data scientists and AI software developers. These frameworks run models natively on a CPU or accelerated on a GPU, requiring little hardware knowledge. But ask those same folks to move their applications to edge devices, and suddenly knowing more about AI acceleration hardware becomes essential – and perhaps a bit intimidating for the uninitiated. Read the full blog here

Duration:00:06:31

What is edge AI inference doing for more devices?

6/22/2023
By Jeffrey Grosman. AI inference is a common term - but what is edge AI inference? EdgeCortix provides an answer in terms of workloads, efficiency, and applications. Artificial intelligence (AI) is changing the rules for many applications. Teams train AI models to recognize objects or patterns, then run AI inference using those models against incoming data streams. When size, weight, power, and time are of little concern, data center or cloud-based AI inference may suffice. But in resource-constrained edge devices, different technology is needed. What is edge AI inference doing for more devices? Let's look at differences in AI inference for the edge and how intellectual property (IP) addresses them. Read the full blog here

Duration:00:07:05