libarmnn33t64

Inference engine for CPUs, GPUs and NPUs - shared library

Arm NN is a set of tools that enables machine learning workloads on any hardware. It provides a bridge between existing neural network frameworks and whatever hardware is available and supported. On arm architectures (arm64 and armhf) it utilizes the Arm Compute Library to target Cortex-A CPUs, Mali GPUs and Ethos NPUs as efficiently as possible. On other architectures/hardware it falls back to unoptimised functions.

python3-opt-einsum

Optimized Einsum is a tensor contraction order optimizer

Optimized einsum can significantly reduce the overall execution time of einsum-like expressions by optimizing the expression's contraction order and dispatching many operations to canonical BLAS, cuBLAS, or other specialized routines.
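
As a rough illustration (the array shapes and names below are made up), the library is typically used as a drop-in replacement for numpy.einsum:

    import numpy as np
    import opt_einsum as oe

    a = np.random.rand(8, 32)
    b = np.random.rand(32, 64)
    c = np.random.rand(64, 4)

    # Same subscript notation as numpy.einsum, but the contraction
    # order is chosen to minimise the intermediate work.
    result = oe.contract('ij,jk,kl->il', a, b, c)

    # Inspect the chosen contraction path and its estimated cost.
    path, info = oe.contract_path('ij,jk,kl->il', a, b, c)
    print(info)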

python3-pyspnego

Windows Negotiate Authentication Client and Server

Library to handle SPNEGO (Negotiate, NTLM, Kerberos) and CredSSP authentication. Also includes a packet parser that can be used to decode raw NTLM/SPNEGO/Kerberos tokens into a human-readable format.
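
A minimal client-side sketch might look like the following; the account, host and service names are placeholders, and the calls assume the spnego.client() factory described in the project documentation:

    import base64
    import spnego

    # Create a client security context for Negotiate authentication.
    ctx = spnego.client('user@EXAMPLE.COM', 'password',
                        hostname='server.example.com', service='http',
                        protocol='negotiate')

    # First token, e.g. for an "Authorization: Negotiate <base64>" HTTP header.
    out_token = ctx.step()
    print(base64.b64encode(out_token).decode())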

python3-samsung-mdc

Samsung Multiple Display Control (CLI, Python 3)

Samsung-MDC is an implementation of the Samsung Multiple Display Control Protocol using Python and asyncio.
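
A hypothetical asyncio sketch of the Python API (the MDC class, the status command method, the IP address and the display ID are assumptions drawn from the project's README, not verified here):

    import asyncio
    from samsung_mdc import MDC

    async def main():
        # Display ID 1 stands in for whatever ID the panel is configured with.
        async with MDC('192.168.0.10') as mdc:
            status = await mdc.status(1)
            print(status)

    asyncio.run(main())

The same commands are also exposed through the samsung-mdc command-line tool.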

python3-xypattern

Python library to handle x-y patterns

A simple, small library to handle x-y patterns, such as those collected with X-ray diffraction or Raman spectroscopy.

libarmnntfliteparser24t64

Arm NN TensorFlow Lite parser library

Arm NN is a set of tools that enables machine learning workloads on any hardware. It provides a bridge between existing neural network frameworks and whatever hardware is available and supported. On arm architectures (arm64 and armhf) it utilizes the Arm Compute Library to target Cortex-A CPUs, Mali GPUs and Ethos NPUs as efficiently as possible. On other architectures/hardware it falls back to unoptimised functions.
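
The shared library itself is consumed through the C++ API, but as an illustration of the overall flow (parse, optimise, load), here is a rough sketch using the separately distributed PyArmNN Python bindings; the model path and backend names are placeholders and the exact calls are assumptions based on Arm's tutorials:

    import pyarmnn as ann

    # Parse a TensorFlow Lite model into an Arm NN network.
    parser = ann.ITfLiteParser()
    network = parser.CreateNetworkFromBinaryFile('./model.tflite')

    # Optimise for the preferred backends, falling back to the reference CPU backend.
    options = ann.CreationOptions()
    runtime = ann.IRuntime(options)
    backends = [ann.BackendId('CpuAcc'), ann.BackendId('CpuRef')]
    opt_network, messages = ann.Optimize(network, backends,
                                         runtime.GetDeviceSpec(),
                                         ann.OptimizerOptions())

    # Load the optimised network into the runtime, ready for inference.
    net_id, _ = runtime.LoadNetwork(opt_network)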