database - Inserting a MATLAB Float Array into a PostgreSQL float[] column


I am using JDBC to access a PostgreSQL database through MATLAB, and I have gotten hung up trying to insert an array of values that I would rather store as an array instead of as individual values. The MATLAB code I'm using is as follows:

insertCommand = 'INSERT INTO neuron (classifier_id, threshold, weights, neuron_num) VALUES (?,?,?,?)';
statementObject = dbHandle.prepareStatement(insertCommand);
statementObject.setObject(1,1);
statementObject.setObject(2,output_thresholds(1));
statementObject.setArray(3,dbHandle.createArrayOf('"float8"',outputNodes(1,:)));
statementObject.setObject(4,1);
statementObject.execute;
close(statementObject);

Everything functions except the line dealing with arrays. The object outputNodes is a <5x23> double matrix, and I'm attempting to put its first <1x23> row into the table.

I've tried several different combinations of names and quotes for the '"float8"' part of the createArrayOf call, but I always get this error:

??? Java exception occurred:
org.postgresql.util.PSQLException: Unable to find server array type for provided name "float8".
    at org.postgresql.jdbc4.AbstractJdbc4Connection.createArrayOf(AbstractJdbc4Connection.java:82)
    at org.postgresql.jdbc4.Jdbc4Connection.createArrayOf(Jdbc4Connection.java:19)

Error in ==> databaseTest at 22
statementObject.setArray(3,dbHandle.createArrayOf('"float8"',outputNodes(1,:)));
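The error message points at the type name itself: the PostgreSQL JDBC driver looks the name up server-side, so it should be the bare identifier float8, not a string with embedded double quotes. A second common snag is that Connection.createArrayOf expects an Object[] of elements, while a MATLAB double row vector marshals to a primitive double[]. Below is a minimal Java sketch of both points; the connection-bound lines are only shown in comments because they need a live database, and the variable names are illustrative, not from the original code:

```java
import java.util.Arrays;

public class FloatArrayParam {
    // Box a primitive double[] (what a MATLAB row vector marshals to)
    // into a Double[], which is a valid Object[] for createArrayOf.
    static Double[] box(double[] row) {
        Double[] boxed = new Double[row.length];
        for (int i = 0; i < row.length; i++) {
            boxed[i] = row[i];
        }
        return boxed;
    }

    public static void main(String[] args) {
        Double[] boxed = box(new double[]{0.5, 1.25, -3.0});
        System.out.println(Arrays.toString(boxed));
        // With a live java.sql.Connection conn, the parameter would then
        // be bound as:
        //   java.sql.Array arr = conn.createArrayOf("float8", boxed);
        //   statementObject.setArray(3, arr);
        // Note the bare type name "float8" -- passing '"float8"' (with
        // embedded quotes) produces exactly the "Unable to find server
        // array type" error quoted above.
    }
}
```

In MATLAB the same boxing can be done with javaArray('java.lang.Double', n) before calling createArrayOf on the connection object.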

Performance of the JDBC connector for arrays

I'd note that if you have to export rather big volumes of data containing arrays, JDBC may not be the best choice. Firstly, its performance degrades due to the overhead caused by the conversion of native MATLAB arrays into org.postgresql.jdbc.PgArray objects. Secondly, it may lead to a shortage of Java heap memory (and simply increasing the Java heap size may not be a panacea). Both these points can be seen in the following picture, which illustrates the performance of the datainsert method from the MATLAB Database Toolbox (it works with PostgreSQL through a direct JDBC connection):

[Figure: insertion performance for arrays]

The blue graph displays the performance of the batchparamexec command from the PgMex library (see https://pgmex.alliedtesting.com/#batchparamexec for details). The endpoint of the red graph corresponds to the maximum data volume that can be passed to the database by datainsert without an error; any volume greater than that maximum causes an "out of Java heap memory" problem (the Java heap size is specified at the top of the figure). For further details of the experiments, please see the following paper with full benchmarking results for data insertion.

Example reworked

As can be seen, PgMex, being based on libpq (the official C application programmer's interface to PostgreSQL), has greater performance and is able to process volumes of at least more than 2GB. Using this library, the code can be rewritten as follows. We assume below that all parameters marked with <> signs are filled in, that the table neuron exists in the database with fields classifier_id of type int4, threshold of type float8, weights of type float8[] and neuron_num of type int4, and, lastly, that the variables classifierIdVec, output_thresholds, outputNodes and neuronNumVec are already defined as numerical arrays of the sizes shown in the comments in the code below (if the types of the table fields are different, you need to fix the last command of the code accordingly):

% create the database connection
dbConn = com.allied.pgmex.pgmexec('connect',[...
    'host=<yourhost> dbname=<yourdb> port=<yourport> '...
    'user=<your_postgres_username> password=<your_postgres_password>']);

insertCommand = ['INSERT INTO neuron '...
    '(classifier_id, threshold, weights, neuron_num) VALUES ($1,$2,$3,$4)'];
SData = struct();
SData.classifier_id = classifierIdVec(:); % [nTuples x 1]
SData.threshold = output_thresholds(:);   % [nTuples x 1]
SData.weights = outputNodes;              % [nTuples x nWeights]
SData.neuron_num = neuronNumVec;          % [nTuples x 1]
com.allied.pgmex.pgmexec('batchparamexec',dbConn,insertCommand,...
    '%int4 %float8 %float8[] %int4',SData);

It should be noted that outputNodes need not be cut along its rows into separate arrays, because those rows are all of the same length. In the case of arrays of different sizes for different tuples, it is necessary to pass them as a column cell array, with each cell containing its own array for each tuple.
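The same per-tuple idea applies on the JDBC side: when each row carries an array of a different length, you build one boxed array per tuple (a jagged array) rather than slicing a fixed-width matrix. A small Java sketch under that assumption, with hypothetical variable names; the database-bound call is commented out:

```java
public class JaggedWeights {
    // Mirror of the cell-array idea: each tuple gets its own boxed
    // array, so the per-tuple lengths are free to differ.
    static Double[][] toJagged(double[][] perTupleWeights) {
        Double[][] jagged = new Double[perTupleWeights.length][];
        for (int i = 0; i < perTupleWeights.length; i++) {
            double[] src = perTupleWeights[i];
            jagged[i] = new Double[src.length];
            for (int j = 0; j < src.length; j++) {
                jagged[i][j] = src[j];
            }
        }
        return jagged;
    }

    public static void main(String[] args) {
        // Two tuples with weight vectors of different lengths.
        Double[][] jagged = toJagged(new double[][]{{1.0, 2.0}, {3.0}});
        System.out.println(jagged[0].length + " " + jagged[1].length);
        // With a live java.sql.Connection conn, each tuple's array would
        // then be bound separately, e.g.:
        //   conn.createArrayOf("float8", jagged[i])
    }
}
```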

