Tensor and Sample
The runtime data model has two primary types:
- Tensor: typed numeric payload (shape, dtype, layout, storage, device).
- Sample: envelope around a tensor payload with media/runtime metadata.
Tensor
Use Tensor when you only need numeric data and shape/type semantics.
Typical metadata carried by a Tensor:
- shape / dtype / layout
- storage and mapping behavior
- optional image/audio/encoded semantic tags
NumPy and PyTorch interop
For Python developers familiar with NumPy and PyTorch, Tensor supports DLPack-based interop:
- Tensor.from_numpy(...)
- Tensor.to_numpy(...)
- Tensor.from_torch(...)
- Tensor.to_torch(...)
- Tensor.from_dlpack(...)
- Tensor.__dlpack__()
This keeps interop paths explicit and enables zero-copy where backend data layout permits it.
import numpy as np
import pyneat as neat
# HWC uint8 image-like tensor
arr = np.random.randint(0, 255, (224, 224, 3), dtype=np.uint8)
t = neat.Tensor.from_numpy(arr, copy=False, image_format=neat.PixelFormat.RGB)
arr_back = t.to_numpy(copy=False)
# If you already have a model, NumPy can be passed directly.
# out = model.run(arr, timeout_ms=2000)
Sample
Use Sample when you need pipeline metadata in addition to tensor bytes.
Typical sample fields:
- caps_string, media_type, payload_tag
- pts_ns, dts_ns, duration_ns
- stream_id, frame_id, port_name
- fields for bundle outputs
#include "neat/session.h"
#include "neat/nodes.h"
// Minimal pipeline: an input node feeding an output node.
simaai::neat::Session session;
session.add(simaai::neat::nodes::Input({}));
session.add(simaai::neat::nodes::Output({}));
auto run = session.build(simaai::neat::Sample{}, simaai::neat::RunMode::Async);
// Populate the Sample envelope around the tensor payload.
simaai::neat::Sample in;
in.kind = simaai::neat::SampleKind::Tensor;
in.stream_id = "cam-0";
in.frame_id = 42;
// in.tensor = ...;
run.push(in);
auto out = run.pull(1000);  // wait up to 1000 ms for a result
if (out) {
std::cout << "stream=" << out->stream_id
<< " frame=" << out->frame_id
<< " media_type=" << out->media_type << "\n";
}
Runtime handle
Both types flow through Run (push, pull, push_and_pull, run).
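The push/pull pattern can be sketched in plain Python with a queue-backed stand-in. Note this `MockRun` class is hypothetical, written only to illustrate the handle's shape; the real Run is backed by the session's pipeline, not a local queue, and `run` (the blocking one-shot variant) is omitted here.

```python
import queue

class MockRun:
    """Hypothetical stand-in for the Run handle (illustration only)."""

    def __init__(self):
        self._q = queue.Queue()

    def push(self, sample):
        # Real API: hands the sample to the pipeline input.
        self._q.put(sample)

    def pull(self, timeout_ms):
        # Real API: waits for a processed sample; None on timeout.
        try:
            return self._q.get(timeout=timeout_ms / 1000.0)
        except queue.Empty:
            return None

    def push_and_pull(self, sample, timeout_ms):
        # Convenience combination of the two calls above.
        self.push(sample)
        return self.pull(timeout_ms)

run = MockRun()
out = run.push_and_pull({"stream_id": "cam-0", "frame_id": 42}, timeout_ms=1000)
assert out["frame_id"] == 42
```

The split between `push` and `pull` is what makes `RunMode::Async` useful: a producer thread can push samples while a consumer thread pulls results, with the timeout bounding how long the consumer blocks.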