NVDLA Engine vs Neural Network Engine

Dear StarFive Support,
I would like to know why the NVDLA Engine and the Neural Network Engine co-exist in the JH7100 on the VisionFive V1 board, please? Could you also give some examples of their use cases, please?

Thanks in advance and best regards,
Khang.


NVDLA is our attempt to use an open-source AI core, while NNE is an engine we developed in-house. We enabled both hardware capabilities to give open-source developers more choices.

Since NVDLA is open source, its toolchains and guidelines can be found at NVIDIA and followed from there.
Our current NNE SDK has not been released yet and is still being optimized.

Thanks @Admin,

Meanwhile, in the NVDLA thread, you said that NVDLA was not quite ready yet. I deduce that it is currently not possible to develop AI/deep-learning applications using either the NVDLA or the NN Engine available in the JH7100 on the VisionFive V1 board, right? Could you also share the plan or roadmap for supporting them in the future, please?

Best Regards,
Khang.

I agree with you. The NVDLA and NNE hardware are ready in the VisionFive, but the software support is not.

About the plan: we are in the process of optimizing the software support for NNE, and we also plan to run NVDLA in our future SoCs. Further announcements will be made on RVspace as soon as we make more progress.


Yes, the documentation for NNE has not been released yet, because the performance is still being optimized.

Do you have any news?

There is still no good news. R&D is currently focused on the development of the VF2 and cannot spare the human resources for this.

To your knowledge, are there plans to continue development of the NNE and NVDLA for the VF1 after the VF2 is released? If not, is there a GitHub repo or similar with the work you have done so far?

A good compromise, in my mind, would be to release the datasheet so that, in the interim, the community can at least tinker. That said, NVDLA at least is public, so with the memory map in the JH7100 datasheet it should be possible to use it today.
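To illustrate the kind of tinkering that becomes possible once a register's physical address is known from the datasheet's memory map, here is a minimal sketch that reads a 32-bit register through `/dev/mem` on Linux. This is a generic technique, not StarFive-specific: the actual NVDLA base address must be taken from the JH7100 datasheet, the `page_align` helper is my own illustrative name, and the read requires root.

```python
import mmap
import struct


def page_align(addr, page=4096):
    """Split a physical address into a page-aligned base and an in-page offset,
    since mmap's offset argument must be a multiple of the page size."""
    return addr & ~(page - 1), addr & (page - 1)


def read_reg32(phys_addr):
    """Read a 32-bit little-endian register at a physical address via /dev/mem.

    phys_addr would come from the JH7100 datasheet memory map (e.g. an NVDLA
    CSR address); requires root and a kernel that permits /dev/mem access.
    """
    base, off = page_align(phys_addr, mmap.PAGESIZE)
    with open("/dev/mem", "rb", buffering=0) as f:
        m = mmap.mmap(f.fileno(), mmap.PAGESIZE,
                      flags=mmap.MAP_SHARED, prot=mmap.PROT_READ,
                      offset=base)
        try:
            return struct.unpack_from("<I", m, off)[0]
        finally:
            m.close()
```

For example, `read_reg32(nvdla_base + 0x0)` would typically fetch an identification or version register, which is a common first sanity check that the block is clocked and reachable before attempting anything more ambitious.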