Processing large database - acceleration
Posted: Fri Oct 09, 2015 5:03 am
Hi,
In my application I process a database with 100,000-200,000 records. I am using a FOXCDX index.
The processing is simple: inside a "do while" loop there are some mathematical operations and lookups in other tables; the structure is shown at the bottom.
This simple loop consumes a lot of time. The example is only part of my function; I have several loops, and processing a database with 100,000 records takes about 25 minutes! on a PC with a Core 2 Duo processor. The CPU is at 100% too, also on a 4-core PC. On my notebook with Windows 8.1 64-bit the processing time is 50 minutes, and that is bad!!!
Can I do something to speed it up? Would it help to change from processing the DBF file directly to processing an array and then writing back to the DBF?
Some of the time is consumed by seek, then by the commands replace, append blank, ...
Or do DBF files (FoxPro CDX index) have limits, and would it help to change to another type of database?
And what will happen if my app must process a database with 500,000 records or more? It will be unbearable.
Thanks
SELECT 1
use data1 index data1
SELECT 9
use data2 index data2
SELECT 1
set order to 1
go top
do while eof() != .T.
   ...
   rec := recno()
   SELECT 9
   set order to 3
   seek something
   if found()
      y := y * 10
   else
      y := 0
   endif
   if rlock()
      replace x with y, ...
   else
      ... warning ...
   endif
   SELECT 1
   go rec
   skip
enddo
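For comparison, here is one way the loop above could be tightened (a sketch only, keeping the same work areas, tags, and placeholder names from the snippet; "something", x and y stand for whatever the real code uses). Two points it relies on: each work area keeps its own record pointer, so selecting area 9 does not move area 1 and the rec := recno() / go rec round trip is unnecessary; and set order to 3 only needs to run once before the loop, not on every iteration.

SELECT 9
use data2 index data2
set order to 3          && choose the seek order once, outside the loop
SELECT 1
use data1 index data1
set order to 1
go top
do while ! eof()
   ...
   SELECT 9
   seek something       && area 1's pointer is untouched by this
   if found()
      y := y * 10
   else
      y := 0
   endif
   if rlock()
      replace x with y, ...
      unlock            && release the lock so they do not accumulate
   else
      ... warning ...
   endif
   SELECT 1
   skip
enddo

Whether this alone explains the 25 minutes is hard to say; the per-record seek and rlock can still dominate, which is where buffering into an array and writing back in one pass may also pay off.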