Performance difference between Windows and Linux

Hello All,

I am very surprised that there is a tenfold performance difference between the Windows and Linux versions of Scilab for a particular script.

The script is here: https://pastebin.com/6FVL5K27

The difference is that Linux is about 10 times faster: the script takes a dozen or so seconds on Linux but several minutes on Windows.

Has anyone observed such a problem, and does anyone know why there is such a difference?


Scilab 14-06-19, 8:57 p.m. pbies

[Note: please paste your Scilab script here for ease of answering your question]

It is difficult to answer this outright. Here are some possibilities:

1. As your script has many plot commands, it is possible that the graphical interface rendering the plots is taking the extra time. When comparing the two platforms, please avoid graphical commands (like plot).

2. You may print the elapsed time at several points in the script; this will help you identify which part of the program is slow (see the sketch after this list).

3. Please check that all other programs have been stopped. Any program actively running in the background can cause Scilab to slow down.

As such, there should not be a significant difference between Windows and Linux.
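
For point 2, here is a minimal sketch of section timing with tic()/toc(), using stand-in computations in place of your actual stages:

// Sketch: coarse section timing with tic()/toc() between stages.
tic();
A = rand(400, 400);                   // stand-in for data preparation
mprintf("prep:    %.3f s\n", toc()); tic();
B = A * A;                            // stand-in for the heavy computation
mprintf("compute: %.3f s\n", toc()); tic();
// plot(B(1,:));                      // keep plot commands disabled while benchmarking
mprintf("plot:    %.3f s\n", toc());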

14-08-19, 12:04 p.m. sunilshetye
It is none of these: 1. it is not the plot commands; 2. yes, I have used tic() and toc(), and the script's execution time does not change; 3. yes, it is a clean system with only Scilab running.

There is a huge difference between the Linux and Windows versions, and it is very strange.

Before posting, please try the script on both systems. Thanks.

14-08-19, 7:38 p.m. pbies
I downloaded the toolbox for Linux and found that it was not being loaded properly. Because of that loading error, the script only appeared to run faster on Linux. The problem is that filenames are case-sensitive on Linux. These were the commands run for Linux:

cd contrib/ANN_Toolbox/0.5/etc
mv ANN_toolbox.start ANN_Toolbox.start
mv ANN_toolbox.quit ANN_Toolbox.quit
sed -i 's/ANN_toolbox.start/ANN_Toolbox.start/' ANN_Toolbox.start

After these fixes, restart Scilab and load the ANN toolbox again. Check that there are no errors while loading this time. The script should now take about the same time as it does on Windows.
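
To double-check from within Scilab that the toolbox is actually loaded before timing anything, a minimal sketch, assuming the module was installed through ATOMS under the name "ANN_Toolbox" (adjust the name if your installation differs):

// Sketch: confirm the toolbox really loaded before comparing timings.
if atomsIsLoaded("ANN_Toolbox") then
    mprintf("ANN_Toolbox loaded OK\n");
else
    mprintf("ANN_Toolbox NOT loaded -- timings are not comparable\n");
    disp(atomsGetLoaded());   // show which modules did load
end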

16-08-19, 3:07 p.m. sunilshetye

I load the libraries once and they load without errors. The problem is in the computation itself: on Windows it takes several times longer than on Linux with the same version of Scilab.

Have you verified the script yourself?


22-04-20, 8:13 p.m. pbies

The script is:

// clear variables
clear

// clear console
clc

// seed the generator so every run uses the same randomness
rand('seed',0);

// track time elapsed
tic();

// data
P = [15    19    20    45    49    7    8
8    12    13    39    44    4    9
6    9    31    43    44    6    9
2    30    34    35    45    1    2
3    17    31    34    40    1    2
14    20    23    39    49    4    10
10    19    24    30    39    2    4
3    12    24    37    38    3    7
2    3    30    31    45    6    8
8    14    23    30    45    1    9
25    31    38    49    50    5    10
12    20    21    22    35    4    10
7    12    28    34    45    3    6
6    27    30    35    41    4    5
4    14    25    34    49    4    9
1    23    32    45    49    5    10
5    12    20    29    48    7    9
1    7    12    23    39    3    4
7    16    22    36    44    3    4
2    6    30    32    49    1    4
2    13    39    45    47    4    6
12    22    24    29    38    5    6
15    19    35    36    41    5    10
1    17    29    39    42    7    8
9    14    28    30    37    3    10
13    19    23    34    41    3    8
3    21    26    40    41    8    10];

// define patterns
k=6; // No. of pattern points
c=3;  // No. of prediction points

// learning parameters lp=[0.1 0.05 0.5 0.1];
// ranges: 0.1-1 0-0.1 0-0.9999 0-0.25
lp=[0.2 0.05 0.5 0];

// T = epochs; 12000 for run; 200/500 for testing
T=200;

// data size
[rowsdata,colsdata]=size(P);

// normalize the learning data to [mn1,mx1] - matching the range r used below
mx=max(P);
mn=min(P);
mx1=1;
mn1=-1;
P=(P-mn)/(mx-mn)*(mx1-mn1)+mn1;
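// illustrative check of the mapping: with the data extremes mn=1 and mx=50
// and [mn1,mx1]=[-1,1], a value of 25 maps to (25-1)/(50-1)*2-1 = -0.0204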

// change absolute values to relative values
for w=2:rowsdata
    S(w-1,:)=(P(w,:)-P(w-1,:))./(P(w-1,:)); // semicolon suppresses the per-iteration console echo
end
// ?
X=S; // (1:rowsdata-1,:);
// fix
X=X+0.5;

// data rows and cols count
[rows,cols]=size(X);

// learning series, pair <x,t>, automatic conversion of X which is a column
x=[];
t=[];

for a=1:cols
    for i=1:(rows-k-c)
        x=[x X(i:i+k-1,a)];
        t=[t X(i+k:i+k+(c-1),a)];
    end
end
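// note: x and t grow by concatenation on every pass; preallocating them is
// usually faster in Scilab, though at this size it hardly matters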

// NN structure
[in_count, pattern_count]=size(x);
[out_count, pattern_count]=size(t);

// number of neurons in layers
N=[in_count 11 out_count];
// limit, default [-1,1]
r=[-1,1];

// initialize the weight hypermatrix (without bias).
// N - Row vector describing the number of neurons per layer. N(1) is the size of input pattern vector, N(size(N,'c')) is the size of output pattern vector (and also target).
// r - Two component row vector defining the smallest and the largest value for initialization. Weights will be initialized with random numbers between these two values.
// r(1) the lower limit
// r(2) the upper limit
// This parameter is optional, default value is [-1,1].
// W - The weight hypermatrix, in the format used by ann_BP_Std_nb, ann_BP_run_nb and other functions working with feedforward nets (without bias).
W=ann_FF_init_nb(N, r);

// learning = online backpropagation with momentum.
// x = Matrix of input patterns, one pattern per column.
// t = Matrix of targets, one pattern per column. Each column has a corresponding column in x.
// N = Row vector describing the number of neurons per layer. N(1) is the size of input pattern vector, N(size(N,'c')) is the size of output pattern vector (and also target).
// W = The weight hypermatrix (initialized first with ann_FF_init_nb).
// lp = Learning parameters
// T = the number of epochs (training cycles through the whole pattern set).
[W,Delta_W_old]=ann_FF_Mom_online_nb(x,t,N,W,lp,T);

// run patterns through a feedforward net (without bias).
// y (result) - Matrix of outputs, one pattern per column. Each column has a corresponding column in x.
// x - Matrix of input patterns, one pattern per column.
// N - Row vector describing the number of neurons per layer. N(1) is the size of input pattern vector, N(size(N,'c')) is the size of output pattern vector.
// W - The weight hypermatrix (initialized first through ann_FF_init_nb).
result=[];
for y=1:cols
    result = [result ann_FF_run_nb(X(rows-in_count+1:rows,y),N,W)];
end
result=[X; result];

// fix
result=result-0.5;
// concat data and result
resultP = [P(1,:); result];
// change relative values to absolute values
for w=2:(rowsdata+out_count)
    resultP(w,:)=resultP(w,:).*resultP(w-1,:)+resultP(w-1,:); // semicolon suppresses the per-iteration console echo
end

// denormalize
resultP=(resultP-mn1)/(mx1-mn1)*(mx-mn)+mn;

// plot data
pY=1:(rowsdata+out_count);
plot(pY,resultP);
// legend('seria 1','seria 2','seria 3','seria 4','seria 5','seria 6','seria 7','seria 8','seria 9',[3])
disp(resultP);
disp(toc()); // Linux=3.97s Windows=31.84s

On a clean Linux Mint 19.3 installation it takes 3.97 s to execute, while the same script on Windows 10 Pro x64 takes 31.84 s. The Windows installation is not misconfigured in any way that would slow it down; only tiny tasks run in the background.

The difference between the execution times is very strange, and it should not be this way. There are no special circumstances that would explain it.
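
If anyone wants to narrow this down, here is a minimal sketch that times only the training call, which dominates the run. It assumes the script's variables x, t, N, W, lp, and T are already in scope; W2 and Delta_W2 are just hypothetical names chosen to avoid clobbering the trained weights:

// Sketch: isolate the cost of the training call alone.
timer();                                          // reset the CPU-time counter
[W2, Delta_W2] = ann_FF_Mom_online_nb(x, t, N, W, lp, T);
mprintf("training CPU time: %.2f s\n", timer());  // CPU seconds spent in training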


01-05-20, 11:45 a.m. pbies
