c++ - Boost.Python: passing a large data structure to Python
I'm embedding Python in a C++ program using Boost.Python in order to use matplotlib. I'm stuck at the point where I have to construct a large data structure, let's say a dense 10000x10000 matrix of doubles. I want to plot columns of that matrix, and I figured that I have several options for doing so:
- iterating and copying every value into a numpy array --> I don't want to do that, for the obvious reason of doubled memory consumption
- iterating and exporting every value into a file, then importing it in Python --> that can be done without Boost.Python at all, but I don't think it is a nice way
- allocating and storing the matrix in Python and updating the values from C++ --> as stated here, it's not a good idea to switch back and forth between the Python interpreter and the C++ program
- somehow exposing the matrix to Python without having to copy it --> all I can find on that matter covers extending Python with C++ classes, not embedding
Which of these is the best option with regard to performance and, of course, memory consumption, or is there a better way of doing this kind of task?
To prevent copying in Boost.Python, one can either:
- use policies to return internal references
- allocate on the free store and use policies to have Python manage the object
- allocate the Python object, then extract a reference to the array within it from C++
- use a smart pointer to share ownership between C++ and Python (a minimal sketch of this approach follows the list)
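As an illustration of the last point, here is a minimal sketch of sharing a C++-allocated matrix with the embedded interpreter through boost::shared_ptr. The Matrix type and the matrix_ext module name are made up for the example, and the Python 2 style init function name (initmatrix_ext) is assumed; under Python 3 the generated name would be PyInit_matrix_ext.

    #include <boost/python.hpp>
    #include <boost/make_shared.hpp>
    #include <boost/shared_ptr.hpp>
    #include <cstddef>
    #include <vector>

    namespace py = boost::python;

    // Hypothetical dense row-major matrix with contiguous storage.
    struct Matrix
    {
      Matrix(std::size_t rows, std::size_t cols)
        : rows(rows), cols(cols), data(rows * cols) {}

      double get(std::size_t r, std::size_t c) const { return data[r * cols + c]; }
      void set(std::size_t r, std::size_t c, double v) { data[r * cols + c] = v; }

      std::size_t rows, cols;
      std::vector<double> data;
    };

    // Expose Matrix held by shared_ptr, so Python and C++ share ownership of
    // the same object instead of copying 10000x10000 doubles.
    BOOST_PYTHON_MODULE(matrix_ext)
    {
      py::class_<Matrix, boost::shared_ptr<Matrix> >(
          "Matrix", py::init<std::size_t, std::size_t>())
        .def("get", &Matrix::get)
        .def("set", &Matrix::set)
        ;
    }

    int main()
    {
      // Register the module before starting the interpreter.
      PyImport_AppendInittab(const_cast<char*>("matrix_ext"), &initmatrix_ext);
      Py_Initialize();
      {
        py::import("matrix_ext");  // runs the module init, registering converters

        // Allocate once in C++, then hand the same object to Python.
        boost::shared_ptr<Matrix> matrix = boost::make_shared<Matrix>(10000, 10000);
        matrix->set(0, 0, 3.14);

        py::object main = py::import("__main__");
        main.attr("matrix") = matrix;  // no element-wise copy takes place
        py::exec("print(matrix.get(0, 0))", main.attr("__dict__"));
      }
      Py_Finalize();
      return 0;
    }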
If the matrix has a C-style contiguous memory layout, then consider using the NumPy C-API. The PyArray_SimpleNewFromData()
function can be used to create an ndarray object that wraps memory which has been allocated elsewhere. This allows one to expose the data to Python without requiring copying or transferring each element between the languages. The How to extend documentation is a great resource for dealing with the NumPy C-API (a short example follows the quoted excerpt):
sometimes, want wrap memory allocated elsewhere ndarray object downstream use. routine makes straightforward that. [...] new reference ndarray returned, ndarray not own data. when ndarray deallocated, pointer not freed.
[...] if want memory freed ndarray deallocated set
owndata
flag on returned ndarray.
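For example, an existing contiguous buffer of doubles can be wrapped along these lines. This is only a sketch: it assumes NumPy >= 1.7, that import_array() has already been called after Py_Initialize(), and that the buffer outlives the ndarray unless ownership is transferred via OWNDATA. The wrap_column() name is made up.

    #include <Python.h>
    #define NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION
    #include <numpy/arrayobject.h>

    // Wrap an existing contiguous buffer of doubles as a 1-D ndarray without
    // copying.  The caller keeps ownership of the memory; import_array() must
    // have been called once before using the array API.
    PyObject* wrap_column(double* data, npy_intp size)
    {
      PyObject* array = PyArray_SimpleNewFromData(1, &size, NPY_DOUBLE, data);
      // To have NumPy free the memory when the ndarray is deallocated instead,
      // enable the OWNDATA flag (the buffer must then be allocated in a way
      // NumPy can free):
      //   PyArray_ENABLEFLAGS((PyArrayObject*)array, NPY_ARRAY_OWNDATA);
      return array;
    }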
Also, while the plotting function may create copies of the array, it can do so within the C-API, allowing it to take advantage of the memory layout.
If performance is a concern, then it may also be worth considering the plotting itself.
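Putting the pieces together, a column owned by C++ can be wrapped and handed to matplotlib from the embedded interpreter. This is only a sketch of the glue code, assuming Python 2 (import_array() is written for module init functions, hence the small helper) and reusing the hypothetical wrap_column() from the previous snippet.

    #include <boost/python.hpp>
    #define NPY_NO_DEPRECATED_API NPY_1_7_API_VERSION
    #include <numpy/arrayobject.h>
    #include <vector>

    namespace py = boost::python;

    // Hypothetical helper from the previous snippet.
    PyObject* wrap_column(double* data, npy_intp size);

    // import_array() is a macro intended for module init functions; wrap it in
    // a helper so it can be called from main() (Python 2 form assumed).
    static void init_numpy() { import_array(); }

    int main()
    {
      Py_Initialize();
      init_numpy();
      {
        // One 10000-element column owned by C++; only a pointer crosses over.
        std::vector<double> column(10000, 0.0);
        py::object array(py::handle<>(
            wrap_column(&column[0], static_cast<npy_intp>(column.size()))));

        py::object pyplot = py::import("matplotlib.pyplot");
        pyplot.attr("plot")(array);  // pass the wrapped array to pyplot.plot
        pyplot.attr("show")();
      }
      Py_Finalize();
      return 0;
    }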