Listviews of big files are taking too long to load

More advanced topics discussed.

Moderators: Susan Smith, admin, Gabriel

Susan Smith
Posts: 717
Joined: Sun Aug 10, 2008 4:24 am
Location: Southern California

Listviews of big files are taking too long to load

Post by Susan Smith »

Hi all,

I am using ScreenIO to create my listviews, though I don't know that this is a ScreenIO issue. One client has about 95,000 items in their inventory. They want a listview to select items when they are doing invoicing, or other tasks so that they don't have to refer to a printed list. (Makes sense...it IS 2013 after all...)

But it takes so long to load that listview that it's not practical to use. I'm not sure where to go from here. I can cut it down somewhat by asking them to purge some old information out of the inventory (they have been keeping some sold items for manual sales analysis purposes), but even then, it's still unacceptably slow to use as a "pick list". I'm dreading their phone call when they ask for a listview for sales history next. That's even worse.

Any ideas? I'm sure that many of you are using larger files than this. Do I have to write the listview myself for this one, rather than using ScreenIO, or do something different to speed it up?

-- Susan
gtisdale
Posts: 218
Joined: Sun Jun 07, 2009 7:54 am
Location: Concord, Massachusetts
Contact:

Post by gtisdale »

I am not using ScreenIO so I'm not sure that I can be helpful. However, when I have this situation (one manufacturing client in particular stands out), I have set a few fields in the file that can be used for "pre-editing" to speed up the process.

For example, if a part is obsolete I mark it as such, and when I read the file, obsolete items are ignored in building the query list but are still available for analysis reports. Similarly, if a part is a component and not sold by itself, its status as a non-salable component is flagged in a field so that it too can be excluded from the pick list.

If you are building the list yourself, it is quicker to grow the arrays in big chunks, say 1,000 records at a time, and then fill those in as you read, rather than incrementing the dimension by one each time you read a record and add it to the list.
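Not BR code, but the chunked-growth idea can be sketched in Python (the helper names and the record source are made up for illustration; in BR this would be MAT re-dimensioning):

```python
CHUNK = 1000  # grow the buffer's capacity in blocks of 1,000 slots

def load_pick_list(read_record, include):
    """Fill a pre-allocated buffer, extending it a whole CHUNK at a time.

    read_record() returns one record per call (None at end of file);
    include(rec) decides whether the record belongs in the pick list.
    """
    buf = [None] * CHUNK              # pre-allocate the first chunk
    count = 0
    while (rec := read_record()) is not None:
        if not include(rec):
            continue
        if count == len(buf):         # buffer full: grow by a whole chunk,
            buf.extend([None] * CHUNK)  # not one slot per record read
        buf[count] = rec
        count += 1
    return buf[:count]                # trim the unused tail
```

(Python lists already amortize their growth internally; the sketch just makes the "grow in big steps, fill in with reads" pattern explicit.)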

FNGeorge
GomezL
Posts: 258
Joined: Wed Apr 29, 2009 5:51 am
Contact:

Post by GomezL »

My first question is "What is unacceptably slow?". Ultimately, it's all relative.

My second question: if this is a multi-user application, are you using client/server?

I would expect 90K entries to take about 14 seconds to load in Client Server Mode. The same query would take 125 seconds for 90K entries using "Distributed Mode".

90K entries is probably too many to display at a single time. As George mentioned, providing some type of filter would reduce the list to something manageable.

Having said that, we do have some pretty large lists ourselves (usually 5K-10K). We have a library that builds the list into arrays and then reuses that array as needed. This way, only the first query takes a long time (about 3 seconds in client/server).

Make sure that you open the table read-only to get the fastest possible performance.
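The build-once-and-reuse idea is independent of ScreenIO; here is a minimal Python sketch of that kind of library (function and cache names are assumptions, not real ScreenIO calls):

```python
_cache = {}  # file name -> list of rows, filled on first use

def cached_rows(fname, load_rows):
    """Return the rows for fname, reading the file only on the first call.

    load_rows(fname) does the expensive full read; later calls reuse the
    in-memory copy, so only the first query pays the load time.
    """
    if fname not in _cache:
        _cache[fname] = load_rows(fname)
    return _cache[fname]

def invalidate(fname):
    """Drop the cached copy, e.g. after the file has been updated."""
    _cache.pop(fname, None)
```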
Susan Smith
Posts: 717
Joined: Sun Aug 10, 2008 4:24 am
Location: Southern California

Post by Susan Smith »

Thanks George and Luis. Your input is very helpful. Unfortunately, I may need to rethink this. There are some situations where I can filter out many records, and other situations where I can't. I may just have to create several different variations of this listview and add some extra index files that might allow me to start with a SEARCH>= in some situations to bypass some records. I will also suggest to the client that they purge some of the old data out of this file. There are inventory items that sold a long time ago, but are still in the file because they run sales analysis on this data. Best to think of a new way to do that :)

You have both given me some valuable food for thought. Thanks.

-- Susan
GomezL
Posts: 258
Joined: Wed Apr 29, 2009 5:51 am
Contact:

Post by GomezL »

You just answered your own question!

Make a new field "LEGACY_INVENTORY", or something like that. Then you can filter those records out (sales analysis will still work).

You can add an index on this field; then the listview will display only the active inventory.
Susan Smith
Posts: 717
Joined: Sun Aug 10, 2008 4:24 am
Location: Southern California

Post by Susan Smith »

Yes, Luis that is my new plan. To summarize for anyone interested:

It wasn't enough to filter records out of the listview by not including them in the array because that STILL required that I READ all 95000+ records to determine which ones to include and which ones to omit. The READING of the records took way too long even if they didn't display in the listview.

The answer is to AVOID reading all of the records if you can. In my case, I have a field in my inventory file for "CurrentBalance". What I will do is create an index for that file whose first component is CurrentBalance. I will start reading the file at CurrentBalance > 0 (by resetting the file with RESTORE/SEARCH). That way, I will skip past the majority of old "legacy records" (as Luis put it), cutting down drastically the time needed to read through the file. But to make this listview useful for those times that I DO need to load all of the records, I added a checkbox on the screen for "Include zero balance items". If they NEED to see everything, they can click on that checkbox and the listview will reload with ALL records. But at least I have a chance to warn them that this will require some additional time. Everybody wins.

And to take it a step further:
In the event that some day they actually purge some of this old inventory, and now WANT to routinely see ALL items - zero balance or not - when they first pop this listview up (without the need to click the checkbox and RELOAD the listview), I will store the default value for "include zero balance items" in a control file of some sort that they can get to. Then the control for the default behavior of this listview is in the hands of the user(s).

Thanks again Luis. It helped me formulate my plan by trying to explain it here. Maybe this will help somebody else with a similar "large file" challenge.

-- Susan
dmeenen
Posts: 50
Joined: Mon Jun 08, 2009 8:34 pm
Contact:

Post by dmeenen »

Susan,
I have a listview in Payroll Maintenance that lists the payroll checks for each individual employee when you are maintaining that employee.
The method I used was to first sort the file down to just what I wanted in the listview:
sort the entire check file to only this employee, and only that employee's payroll checks.
Then display the listview with the sorted file.
It takes about 2 seconds.
Don't know if this applies to your situation, but it is fast.
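Doug's sort-first approach, sketched generically in Python (the row layout is a made-up example, not his payroll record format):

```python
def employee_checks(checks, emp_id):
    """Extract one employee's payroll checks and sort them by date,
    mirroring "sort the check file down to the wanted rows, then hand
    the sorted subset to the listview".  checks: (emp_id, date, amount).
    """
    subset = [c for c in checks if c[0] == emp_id]
    subset.sort(key=lambda c: c[1])  # order by check date
    return subset
```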
Susan Smith
Posts: 717
Joined: Sun Aug 10, 2008 4:24 am
Location: Southern California

Post by Susan Smith »

Doug, that's a really interesting idea. I will test that to see how fast it is on this file. I have been abandoning "sorts" in favor of "indexes" in recent years, but I guess there is still a place for ALL of the tools!

Thank you!

-- Susan
gordon
Posts: 358
Joined: Fri Apr 24, 2009 6:02 pm

Post by gordon »

One of the problems with giving people access to extremely large amounts of data is that listviews become unwieldy if they are too large. There is simply too much data to review in order to find what you are looking for.

At the last conference I demonstrated a technology that facilitates the interrogation of extremely large files by dividing the file into one hundred sections and letting the user select a section to expand, which is then divided into another hundred sections. That presentation continues until a certain threshold, say 2,000 records, is reached, at which point the entire section is presented.

This technique reads only the records presented to the user. One problem with it is that large databases that are continuously updated cannot be quickly segmented in the manner described. The technique depends on binary searching an ISAM index. However, a convenient way around that problem is to use a static index for the segmentation, but a dynamic index for low-level contiguous record reading.
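The drill-down can be sketched abstractly; a minimal Python illustration, assuming the index is addressable by position (function names and the 2,000-row threshold default are taken from the description above, everything else is made up):

```python
def segments(lo, hi, parts=100):
    """Split index positions [lo, hi) into up to `parts` near-equal
    sections; the user expands one and we recurse into it."""
    n = hi - lo
    if n <= parts:
        return [(i, i + 1) for i in range(lo, hi)]
    step = -(-n // parts)  # ceiling division
    return [(s, min(s + step, hi)) for s in range(lo, hi, step)]

def drill(index, lo, hi, threshold=2000):
    """Present a section: the records themselves once the section is
    small enough, otherwise another layer of sub-sections.  Only the
    records actually shown are ever read."""
    if hi - lo <= threshold:
        return ("records", index[lo:hi])
    return ("sections", segments(lo, hi))
```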

I never perfected the technique in a manner sufficient for production because it didn't seem to generate interest.
gordon
Posts: 358
Joined: Fri Apr 24, 2009 6:02 pm

Post by gordon »

Correction. I'm not sure what I was thinking of a moment ago, but it is not necessary to binary search for this process to work. The ISAM index is segmented mathematically. I think I may have recalled our first attempt using a B-Tree index to do the same thing.

One more thing: using this method in conjunction with the BR Filter capability at the low-level contiguous presentation can be an extremely powerful technique. This is where rows displayed in a listview can be restricted to rows containing one or more partial keywords.
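The partial-keyword restriction Gordon describes amounts to a substring match over each row's fields; a hedged Python sketch (not the actual BR Filter implementation):

```python
def filter_rows(rows, *keywords):
    """Keep only rows in which every partial keyword appears somewhere,
    case-insensitively -- i.e. restrict the displayed listview rows to
    those containing one or more partial keywords.
    """
    keys = [k.lower() for k in keywords]
    return [r for r in rows
            if all(any(k in str(field).lower() for field in r)
                   for k in keys)]
```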
Susan Smith
Posts: 717
Joined: Sun Aug 10, 2008 4:24 am
Location: Southern California

Post by Susan Smith »

Gordon, I'm not sure what you mean by "use a static index for the segmentation, but use a dynamic index for low level contiguous record reading." I guess I don't know what dynamic or static indexes are. Do you mean indexes that are routinely kept on a file vs. indexes that are created on demand for these look up purposes?

In the meantime, I have convinced the client that it is in their best interest to pare down this file. In their case, much of the file is old inventory items - product lines long since sold. And although they wanted to keep this on file for sales analysis purposes and trends, I have suggested that they keep a PORTION of it instead since sales trends based on 15 year old inventory are probably not as helpful as those that are built on 3-5 year old inventory. I have suggested that we offload the purged records to an archive file that can be merged with the "current" inventory file for reporting purposes at some point in the future if they ever really determine that this is necessary (which I suspect won't happen). That safety net made a big difference.

When I added that even Excel has a limit of 65,536 rows, and that loading a listview with 95,000+ records will take 20-30 seconds every single time they want to look up an item # - even on a single-user LOCAL computer - they began to see this huge inventory file for what it is...overkill. In fact, since we discussed these issues at length, with me explaining the technical side of it and them explaining their business goals, this project has become even more of a partnership. Clients can be quite willing to work within limitations when they understand the constraints and are respected enough to be brought into that discussion.

Since I can pare down the listview loading response time even more by skipping past zero balance inventory records using SEARCH >=, this has helped enormously.

-- Susan