
#1 2014-11-15 12:41:12

alex.lechner
Guest
From: Australia
Registered: 2013-06-12
Posts: 17

Dealing with Large datasets

Hi,

Just wondering if there is any way to increase the size of the input layers I can process.

I receive the following error when I run large datasets, and sometimes when I include a resistance surface with a large dataset.

"java.lang.RuntimeException: java.lang.OutOfMemoryError: Java heap space"

This appears to occur after a relatively short amount of processing time, roughly 5 minutes after Graphab starts creating the linkset.

I was wondering if there is anything I can do to address this memory limitation other than reducing the pixel size.

I am running the following version of Java:

java version "1.7.0_40"
Java(TM) SE Runtime Environment (build 1.7.0_40-b43)
Java HotSpot(TM) 64-Bit Server VM (build 24.0-b56, mixed mode)

On the following PC:

Intel i7-3770K CPU @ 3.50Ghz
32 GB ram
64 bit windows

BTW really enjoying using the software.

Cheers,

Alex


#2 2014-11-24 10:40:18

admin
Graphab Dev
Registered: 2013-05-28
Posts: 25

Re: Dealing with Large datasets

Hi,
The linkset creation is the most memory-expensive feature.
If you use more than one processor, you can decrease the number of processors used (menu File -> Preferences).
For each processor used, Graphab allocates a raster in memory, so decreasing the number of processors also decreases the memory Graphab allocates for linkset creation.
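If you want to verify how much heap the JVM has actually granted to Graphab, a small standalone sketch like the following can help (this is not part of Graphab; the class name is invented here):

```java
// A minimal sketch (not Graphab code): print the heap ceiling the JVM
// actually granted, to verify that a memory setting took effect.
public class HeapCheck {
    public static void main(String[] args) {
        // maxMemory() reports the largest heap the JVM will attempt to use
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}
```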

Gilles


#3 2014-12-07 10:09:32

alex.lechner
Guest

Re: Dealing with Large datasets

Thanks for the quick response.

I noticed that I can't seem to increase the memory size beyond ~725 MB, even though I have 32 GB of memory. I have been able to alter the number of processors.

Thanks

Last edited by alex.lechner (2014-12-07 10:21:01)


#4 2014-12-11 16:31:08

admin
Graphab Dev

Re: Dealing with Large datasets

If you are unable to increase the memory beyond 1 GB, you are using a 32-bit version of Java.
After launching Graphab, check in the Task Manager whether the process name of java.exe ends with *32.
If it does, Graphab was launched by a 32-bit version of Java.
You may have several versions of Java installed, some 32-bit and others 64-bit.
You can try removing all your versions of Java and installing only one 64-bit version.
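You can also check the bitness from Java itself. A quick sketch (class name invented here; on Oracle/HotSpot JVMs the `sun.arch.data.model` property is "32" or "64"):

```java
// A quick sketch (not Graphab code): report whether the running JVM
// is 32-bit or 64-bit.
public class BitnessCheck {
    public static void main(String[] args) {
        // On Oracle/HotSpot JVMs this property is "32" or "64";
        // fall back to "unknown" on JVMs that do not define it.
        String model = System.getProperty("sun.arch.data.model", "unknown");
        System.out.println("JVM data model: " + model + "-bit");
        System.out.println("os.arch: " + System.getProperty("os.arch"));
    }
}
```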
Gilles


#5 2014-12-17 12:21:14

alex.lechner
Guest

Re: Dealing with Large datasets

Thanks. I uninstalled the other versions of Java as suggested, and I was able to raise the maximum memory to 3005 MB. I have 32 GB of RAM, so I guess this is a Java or Graphab limitation. Or am I doing something wrong?

Alex


#6 2014-12-17 15:12:38

admin
Graphab Dev

Re: Dealing with Large datasets

With 64-bit Java, Graphab can allocate more than 3 GB.
Have you checked whether the process name of java.exe ends with *32?
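One way to rule out the wrong JVM being picked up is to launch Graphab with an explicit path to the 64-bit java.exe and a heap flag. This is only a sketch: the install path and jar name below are assumptions, so adjust them to your machine.

```shell
# Sketch only: the Java path and jar name are assumptions, not the
# actual install layout. Point at the 64-bit JVM explicitly and
# request a 16 GB heap ceiling with -Xmx.
"C:\Program Files\Java\jre7\bin\java.exe" -Xmx16g -jar graphab.jar
```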


#7 2015-03-30 23:19:42

alex.lechner
Guest

Re: Dealing with Large datasets

Hi, sorry for the long delay (new job!)

I have got it working, though I'm not sure exactly how. I had a bit of a problem previously, as I needed to have two versions of Java installed: 32-bit for my browser and 64-bit for Graphab.

Anyway, it seems to work now.

Thanks

