
MACsec IP Core from Xiphera
BittWare Partner IP: MACsec IP Core (IEEE 802.1AE). The Xiphera MACsec family provides high-speed IP cores implementing the MACsec (Media Access Control Security) standard.
Working alongside CPUs, FPGAs form part of a heterogeneous approach to computing. For certain workloads, FPGAs deliver significant speedups over CPUs; in this case, 50x faster machine learning inference.
FPGAs offer a range of tools to tailor the design to the application. The hardware fabric adapts to use only what's needed, including hardened floating-point blocks when required. For BWNN's weights, we used only a single bit plus a mean scaling factor, and still achieved acceptable accuracy while saving significant resources.
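The one-bit-plus-scaling idea can be sketched in a few lines: each weight is reduced to its sign, and a single per-tensor scaling factor (the mean of the absolute weight values) recovers most of the dynamic range. This is a minimal NumPy sketch of the general binary-weight technique; the function names are hypothetical and the exact quantization scheme used in BittWare's BWNN design is not detailed here.

```python
import numpy as np

def binarize_weights(w):
    """Reduce a float weight tensor to 1-bit signs {-1, +1} plus a
    single scaling factor alpha = mean(|w|).
    Hypothetical helper illustrating the general binary-weight scheme."""
    alpha = float(np.mean(np.abs(w)))    # per-tensor mean scaling factor
    signs = np.where(w >= 0, 1.0, -1.0)  # the only stored per-weight data: 1 bit
    return signs, alpha

def approx_dot(x, signs, alpha):
    """Approximate the full-precision dot product x . w
    as alpha * (x . sign(w)) -- multiplies become sign flips."""
    return alpha * float(np.dot(x, signs))

# Example: four weights collapse to four sign bits and one scalar.
w = np.array([0.5, -0.25, 0.75, -1.0])
x = np.array([1.0, 2.0, 3.0, 4.0])
signs, alpha = binarize_weights(w)       # signs = [1, -1, 1, -1], alpha = 0.625
y = approx_dot(x, signs, alpha)          # approximates np.dot(x, w)
```

In hardware this is what saves the resources: the per-weight storage drops from 32 bits to 1, and the multiply-accumulate reduces to conditional add/subtract, with one real multiply by alpha at the end.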
Performance per watt matters not only at the edge; it also figures in the datacenter budget, in both rack space and the cost of power. FPGAs can deliver the latest efficient libraries at far better performance per watt than CPUs.
With BittWare's exclusive optimized OpenCL BSP, you can tap into both software-oriented developers and the latest software libraries. This allowed us to quickly adapt the YOLOv3 framework, which offers improved performance over older ML frameworks.
We target applications where the demand to process stored data outpaces traditional CPU-based architectures.
FPGAs allow customers to create application-specific hardware implementations that exhibit the following properties:
Get answers to your HPC questions from our technical staff.
"*" indicates required fields
White Paper: FPGA Acceleration of Binary Weighted Neural Network Inference. One of the features of YOLOv3 is multiple-object recognition in a single image.
Architectural Concepts: NVMe High-Speed Data Capture and Recorder. There are many streaming data sources customers want to capture.
Article: FPGA Neural Networks, the inference of neural networks on FPGA devices. The ever-increasing connectivity in the world is generating ever-increasing levels of data.