4.7 Performance Issues
Testing and Balancing Across Multiple Platforms
Different operating systems are good at doing different things.
For example, an IBM i is very good at database I/O, while Windows is very good at CPU-intensive operations like SORT_LISTs.
You cannot sensibly estimate an application's performance on one platform by testing the application on another platform.
It is your responsibility to find the right balance between differing hardware platforms. Your application design must consider how the platforms are different and determine how to best utilize the specific features the platform has to offer.
Amount of Information in Panels/Dialogs/Windows
Do not overload screen panels, dialogs, or windows.
For example, a screen panel containing 100 fields may be slower to present when executing under Windows than on an IBM i.
"Drawing" time in advanced graphical environments is directly related to the amount of information (i.e. number of objects) in the windows.
PC, Desktop and LAN Based DBMS
Generally, IBM i developers form their perceptions about the speed and methods of database access from their IBM i experience. Unfortunately, this may be the worst possible place to gain those perceptions.
The IBM i is a multi-user database machine. Its database access is very fast and relatively little attention has to be paid to the impact of DBMS requests on the system or other users.
However, as we move more and more towards PC, Desktop and LAN based DBMS, developers may have to alter their perceptions and change their habits accordingly.
This is a subjective area, and PC DBMS vendors may dispute this, but development experiences so far have indicated:
- Rarely will a PC based DBMS perform as well as an IBM i DBMS.
- Some PC based DBMS are really only good for the home user, even though they may be sold as being suitable for high volume, multi-user commercial use.
- Most developers test their applications with very small data sets and will not identify a DBMS performance problem area until after the application gets into the hands of production users.
So, in any context where you are using a PC based DBMS, please think about the following before you implement an application:
- There are hardware considerations. For example, an older Pentium PC using an IDE hard drive may have "useable" disk access times 5 times slower than a newer Pentium PC with a SCSI drive / controller / cache.
- Avoid SX computers. When decimal numbers and high volumes are involved make sure that you are using a DX computer that has a floating point co-processor. Cases we have seen indicate that in numerically intensive high volume database access a DX computer may be 4 to 6 times faster than an equivalent SX.
- Think about loads and volumes during the design phase.
- The most likely problem area involves SELECT commands against tables that contain large numbers of rows. Slow SELECTs are the most common cause of PC DBMS performance problems.
- Be CAREFUL when using SELECT on a table that you know will have a large number of rows. If there is no index to support your SELECT, then the DBMS will read the whole table. If there are 100,000 rows, then this operation WILL take a long time. Even if there appears to be an index, the DBMS may decide to read the whole table anyway.
However, if you are testing with realistic volumes then you can identify and remove the problem area before it gets to production users.
- Be RESPONSIBLE. It is no good designing and implementing an application without any thought for loads, volumes and viability, THEN finding it is too slow to be usable, and complaining to LANSA or to the DBMS vendor. You need to AVOID the situation in the first place.
- SELECT_SQL is the fastest type of DBMS access available: it creates embedded SQL in the generated C code. If you cannot achieve a viable result using SELECT_SQL, you will have to find an alternative solution to the problem.
- When very large tables are involved, consider placing them on an IBM i server and then using LANSA SuperServer or Remote Function calls to access them, possibly only returning the summarized result(s) in a working list.
- Prototype and test under realistic loads and volumes any table that is expected to have more than 1000 rows in it. This way you should find, and remove, any problem areas before they become critical.
The 1000-row figure is totally arbitrary. Based on your own experience, you may choose to lower it to as little as 100 for some PC DBMS, or raise it to 5000 for others.
The performance of PC based DBMS covers a very broad spectrum. After you have chosen one, you will have to develop a perception of what its capabilities are and thus at what point you have to start being careful.
- Become familiar with your selected DBMS. Read the sections in any guides / manuals that it provides related to performance and tuning. Learn to optimize its performance.
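The warnings above about unindexed SELECTs can be demonstrated with almost any DBMS's query plan facility. The sketch below uses SQLite via Python purely as a stand-in for a PC based DBMS (an assumption; the table and column names are invented for illustration). It loads 100,000 rows, then shows that the same SELECT is resolved by a full table scan before an index exists and by an index search afterwards:

```python
import sqlite3

# In-memory database standing in for a PC-based DBMS (illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, cust_no INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    ((i, i % 500, float(i)) for i in range(100_000)),
)

def plan(sql):
    # EXPLAIN QUERY PLAN reports how SQLite will resolve the statement;
    # the human-readable detail is the last column of the first row.
    return conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()[0][-1]

query = "SELECT * FROM orders WHERE cust_no = 42"

# Without a supporting index the plan is a full "SCAN" -- every row is read.
print(plan(query))

conn.execute("CREATE INDEX ix_cust ON orders (cust_no)")

# With the index in place the plan becomes a "SEARCH ... USING INDEX ix_cust".
print(plan(query))
```

The same prototype doubles as a realistic-volume test: generating the 100,000 rows and timing the two SELECT variants before production users ever see the application is exactly the discipline the list above recommends.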
Disk Space Considerations
When using Visual LANSA to develop and test IBM i-based applications, you should not keep source or objects that are not required to execute the application (e.g. .C, .H, .STG, .DEF and .MAK files) on your PC. By removing these files, you will substantially increase the amount of available disk space on your PC.
If you are creating stand-alone applications for Visual LANSA, you should also consider compiling the final version of your programs without debug enabled. Compiling without debug will further increase available disk space.