Developing Business Advantage
Crystal Reports Limited/Paged Dynamic Parameters Lists
Date Posted: December 11, 2013
A number of years ago SAP introduced a limit on the number of values that appear in dynamically populated parameter lists. The reason for this was to prevent a report containing a parameter with an overly large dataset from causing the parameter prompt to crash or take a very long time to load. For the rest of us with reasonably sized dynamic parameters this limitation is quite annoying. Here is how you can adjust the rules governing the list:

Changing the number of pages that will be available:
Crystal Reports 2013
Create a registry key “HKEY_LOCAL_MACHINE\SOFTWARE\SAP BusinessObjects\Suite XI 4.0\Crystal Reports\DatabaseOptions\LOV”
Crystal Reports 2008
Create a registry key “HKEY_LOCAL_MACHINE\SOFTWARE\Business Objects\Suite 12.0\Crystal Reports\DatabaseOptions\LOV”
Crystal Reports XI Release 2
Create a registry key HKEY_LOCAL_MACHINE\SOFTWARE\Business Objects\Suite 11.5\Crystal Reports\DatabaseOptions\LOV.

Add a string value "MaxRowsetRecords" and set the value to the maximum number of values that you desire for your report. For example, a value of 2000 will return up to 2000 values in the lowest level of a cascading parameter.
NOTE: The value 0 (unlimited) will not work with BusinessObjects Enterprise XI or Crystal Reports Server XI, you must specify another value.
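As an illustration, the Crystal Reports 2008 setting above can be captured in a .reg file (the key path varies by version, on 64-bit Windows the key may live under Wow6432Node, and the value 2000 is just an example):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Business Objects\Suite 12.0\Crystal Reports\DatabaseOptions\LOV]
"MaxRowsetRecords"="2000"
```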

Changing the number of records per page within a dynamic parameter:
Crystal Reports 2013
Modify the registry key “HKEY_CURRENT_USER\Software\SAP BusinessObjects\Suite XI 4.0\Crystal Reports\ReportView”. Change the value “PromptingLOVBatchSize” to decimal 800 (the default is 200). We have encountered problems where Crystal Reports will revert to “200” if you set the value too high.
 

 
Real-Time Paperwork
Date Posted: March 20, 2013

I was recently working through a large pile of paperwork containing customer orders in various states: open, in progress, completed, etc. Being somewhat impatient, I was getting tired of having to be near a computer and re-enter the order number just to find out the status of the order documented. I thought it would be nice if I could use my phone to scan a barcode on the document and have the phone automatically retrieve the real-time status of the order without typing anything.

To solve this challenge we created a web page requiring no user intervention. This page accepts the information from the barcode and displays the real-time status of the document without the need to go to a computer or log in to a web site. The user just scans the barcode and the result appears on the phone with no further work. It is proving to be a great technique for processing paperwork quickly based on real-time information, and you don’t need to be sitting at your computer!
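The mechanics are simple enough to sketch in Python (the URL and parameter name here are hypothetical, not our actual page): the barcode printed on the document encodes a URL carrying the order number, and the status page just reads it back out of the query string:

```python
from urllib.parse import urlparse, parse_qs, urlencode

# Hypothetical status-page address; the real page would be on your own server.
BASE = "https://example.invalid/orderstatus"

def barcode_payload(order_number):
    """URL to encode into the document's barcode; scanning it opens the
    status page with the order number already filled in."""
    return f"{BASE}?{urlencode({'order': order_number})}"

def order_from_scan(url):
    """What the status page does server-side: pull the order number out of
    the scanned URL's query string before looking up the live status."""
    return parse_qs(urlparse(url).query)["order"][0]
```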




 

 
Crystal Reports & Time Durations
Date Posted: June 05, 2012
While summarizing employee hours for a client I was tasked with formatting aggregate time information in ‘Hours’ and ‘Minutes’ notation for improved readability. Although Crystal Reports has a series of functions for formatting dates and times, there are no special functions that address durations. As a result I wrote the following Crystal Reports formula, which takes a duration (in minutes) and translates it to “X Hours Y Minutes”:


whileprintingrecords;
local numbervar DurationInMinutes;
local numbervar Hours;
local numbervar Minutes;

DurationInMinutes := Sum (INSERT YOUR TOTAL TIME IN MINUTES HERE);
Hours := FLOOR(DurationInMinutes / 60);
Minutes := REMAINDER(DurationInMinutes, 60);

SWITCH
(
    Hours < 1, TOTEXT(Minutes, 0, "") + " Minutes",
    Hours >= 1 AND Minutes < 1, TOTEXT(Hours, 0, "") + " Hours",
    Hours >= 1 AND Minutes >= 1, TOTEXT(Hours, 0, "") + " Hours " + TOTEXT(Minutes, 0, "") + " Minutes",
    TRUE, ""
)
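For anyone wanting to sanity-check the arithmetic outside of Crystal, here is the same hours/minutes breakdown as a small Python sketch (the function name is mine, not part of the report):

```python
def format_duration(total_minutes):
    """Translate a duration in minutes to "X Hours Y Minutes",
    mirroring the SWITCH branches in the Crystal formula above."""
    hours, minutes = divmod(total_minutes, 60)
    if hours < 1:
        return f"{minutes} Minutes"
    if minutes < 1:
        return f"{hours} Hours"
    return f"{hours} Hours {minutes} Minutes"
```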

 

 
Integrating IIS IP Restriction Management with .NET Applications
Date Posted: April 05, 2012
For a number of reasons I believe R&D meetings are some of the most fun of any meetings we have at our office. During a recent meeting we were talking about ways to improve the security of our web application when a specific module/section should only ever be accessed from within the corporate LAN. In some cases this is due to the bandwidth intensity of that section; in others it is a security control that helps govern access beyond the user level. Although there are numerous ways to accomplish this manually, we could not find an effective solution that would allow application administrators to manage it from a web interface. Conventional solutions involved firewall rule changes or modifications within IIS. Both are manual and require server access beyond the scope of the application.


We decided to create a component that would handle this process and expose the management of IP restrictions from within our application’s standard administration components. As a result, application admins can decide who can access all or part of our web applications, and from which IP addresses. This completely eliminates the need for application administrators to modify IP address restrictions using the IIS management interface, since they are often not qualified or not allowed to make changes in IIS directly. The solution also integrates directly into the application, so we can control internal and external access in the context of the application instead of with a broad all-or-nothing rule.
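One way to sketch the underlying idea is to drive IIS’s ipSecurity configuration from application code via appcmd.exe (the site name and IP below are examples, and this is an illustration of the technique rather than our actual component):

```python
import subprocess

# Standard location of the IIS command-line configuration tool.
APPCMD = r"C:\Windows\System32\inetsrv\appcmd.exe"

def build_allow_rule(site, ip):
    """Build the appcmd invocation that adds an 'allow' entry to the
    ipSecurity section for one site. Site and IP are caller-supplied."""
    return [
        APPCMD, "set", "config", site,
        "-section:system.webServer/security/ipSecurity",
        f"/+[ipAddress='{ip}',allowed='true']",
        "/commit:apphost",
    ]

def apply_rule(site, ip):
    # Requires administrative rights on the IIS server.
    subprocess.run(build_allow_rule(site, ip), check=True)
```

An admin page would call `apply_rule` with values taken from the application’s own permission screens, keeping the operator out of IIS Manager entirely.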

Taking this one step further, we also added the ability to specify the duration of an IP restriction. Since IIS does not have a scheduler integrated with its IP restrictions, our component handles this and manages the implementation automatically. This allows us to grant some users permanent access and others temporary access based on a window of time.

Of course this is just one of many layers of security that should be considered. With this tool we can now take advantage of IP restrictions in environments where traditionally it would have been too cumbersome to do so. So far it is proving to be a powerful new tool!
 

 
Crystal Reports: Reasons for excessive & unexplained load on SQL Servers
Date Posted: March 28, 2012
While analyzing a couple of SQL Servers for performance issues we uncovered a severe performance problem when running Crystal Reports. We were running through a particularly complex Crystal Report, trying to evaluate index performance using SQL Profiler, when we noticed more than 75 entries like the following:


exec [MYDATABASENAME]..sp_tables_rowset N' MYTABLENAME ',N'dbo',NULL
exec [MYDATABASENAME]..sp_procedures_rowset N' MYTABLENAME ',1,N'dbo'
exec [MYDATABASENAME]..sp_columns_rowset N' MYTABLENAME ',N'dbo',NULL

Some of the queries consumed CPU, and all of them performed a number of reads against the database. The cumulative CPU time totalled 200 while the database reads numbered more than 2000. This load does not include the actual SQL queries executed afterward to retrieve the real data. When you consider this is one user running one report, and extrapolate that load to a production environment, the results are concerning!

Those unfamiliar with the procedures “sp_tables_rowset”, “sp_procedures_rowset” and “sp_columns_rowset” should know these are system stored procedures responsible for exchanging internal information between SQL Server and the OLE DB or Native Client data driver. Essentially they manage various metadata about the tables, views or stored procedures that your data driver is trying to interact with. Although their purpose seems harmless enough, the frequency and cumulative load placed on the SQL Server justified further investigation.

We were able to quickly determine, from the particular tables and views referenced and the subsequent queries executed, that almost all occurrences related to Crystal Reports within our applications. Since the purpose of this ‘extra work’ on the data server seemed to relate to the system retrieving metadata, I postulated that some sort of table/data verification mechanism was at fault. Knowing that Crystal Reports provides a function known under many labels as “Verify Database”, I assumed this was the best place to start looking. Sure enough, using the Crystal Reports designer I verified a correlation, but this did not answer the question of why our applications were incurring this extra load at run-time, since those reports are loaded in memory.

We reviewed our code and confirmed that setting the table location with the following code seemed to be one initiator of the problem:

C#:
// CrystalDecisions.Shared.ConnectionInfo holds the live connection details;
// "info" is the table's TableLogOnInfo and "item" is the report's Table object.
ConnectionInfo ci = new ConnectionInfo();
ci.ServerName = this.tbServerName.Text.Trim();
ci.DatabaseName = this.tbDatabaseName.Text.Trim();
ci.UserID = this.tbUserID.Text.Trim();
ci.Password = this.tbPassword.Text.Trim();
info.ConnectionInfo = ci;
item.ApplyLogOnInfo(info);

VB6 (RDC):
crxDatabaseTable.Location = Trim(YourDBNameHere) & ".dbo." & crxDatabaseTable.SubLocation

This code is required to force any live application to redirect the Crystal Report to the live database. It is very common that a Crystal Report will be created in a development environment and then deployed to the live environment. This is true whether you are working with custom or off-the-shelf software. As a result the live application must redirect the report to the correct database at run-time every time!

In an effort to eliminate this excessive load on the server we modified the report to point to the live database (instead of the default development database) and reran the report. We noticed the excessive load on the server was cut in half! This was a great start, but we wanted to eliminate the unnecessary load entirely. We then dug through the Crystal Report and confirmed that both the main report and its subreports (of which there were two) had the following settings turned ON:

“Verify on First Refresh”
“Verify Stored Procedures on First Refresh”

After turning these settings OFF we redeployed and confirmed the excessive queries on the server were completely eliminated! As it turned out, setting the data source location at run-time caused one full pass over all table/view/procedure objects in the Crystal Report. Another pass occurred for any database objects in the subreports, another for the database verify on the main report, and again for the subreports. A report with two tables and one subreport containing another two tables caused four full passes over all database objects. The performance implications get worse depending on the complexity of the objects processed. If you have a SQL view or stored procedure that accesses many tables, each of those tables will be included in these multiple passes. This proves to be a performance nightmare that is difficult to eliminate.

The moral of the story: it is very important to know how your Crystal Reports are associated with their data sources, and not to rely on the application to overwrite the data source. In an environment with numerous Crystal Reports being executed throughout the day, this load can have a real impact on your servers!
 

 
Windows 2008 R2 Installation & Unusable Drive Partitions
Date Posted: March 13, 2012
We recently uncovered a major annoyance while setting up a Windows 2008 R2 x64 server for testing purposes. Since this server installation would be used as a test for Exchange Server 2010 we decided to purchase a new hard drive (Seagate SAS 300GB 15K rpm). This will be of interest later in the story, as the drive was supposed to be a clean, non-refurbished drive.

Immediately we hit a wall: the drive was not visible to the installation process. The Windows installer claimed no drive existed. We verified the SAS controller driver was loaded properly for the Windows installation. We also attached the drive as a secondary alongside an existing drive that we knew to be working. After doing this we confirmed our new hard drive was accessible and working. Using the existing OS we formatted the drive without a problem and confirmed the drive as functional using the SAS BIOS verification process. At this point we trusted that our drive was functional.

We shut down the server, removed the other hard drive (leaving only our new drive in the system) and reattempted the Windows 2008 R2 installation. This time the drive appeared, showing the partition we had created from the other OS. Just for habit’s sake we deleted the existing partition and recreated it. Windows prompted us with the following notification:

Windows might create additional partitions for system files

Remembering this to be the silly 100MB system partition that Windows seems to think useful, I told the installer to proceed; however, nothing happened. Usually this is where we would see two partitions created, one for the system partition and the other as the primary partition for the OS, but this was not the case. After repeating this a few times we decided to create the 100MB partition manually and then try to assign the remaining space to our primary partition. The installer let us create the 100MB partition without a problem, however we noted a small footnote on the screen claiming the following:

Windows is unable to install to the selected location. Error: 0x80300024 (more info).

When clicking on the ‘more info’ link we received the following warning:

Windows cannot be installed to this disk. The selected disk is of the GPT partition style.

Thankfully this was the clue we needed. For whatever reason the disk was designated as a GPT disk. Any partitions we created, even using the Windows installer, carried the GPT designation. Since GPT partitions (as flexible as they may be) do not allow for hidden partitions, a requirement of the Windows ‘hidden’ system partition, the disk could not be used in its current state. A more relevant error-handling process seemed in order from Microsoft.

The solution: connect the hard disk to another machine/existing OS as a secondary drive. Ensure that Windows sees the disk as a non-GPT disk (i.e. MBR partition style). Take your new drive (no partitions necessary) back to the new installation; Windows will now be able to create that hidden partition properly and your new OS will install without a problem.
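On the second machine the conversion can be done with the built-in diskpart utility (disk 1 below is an example number; “clean” erases everything on the selected disk, so choose carefully):

```
diskpart
list disk
select disk 1
clean
convert mbr
exit
```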

For more information on how GPT partitions work see here:
http://msdn.microsoft.com/en-us/windows/hardware/gg463525
 

 
Calculating Work Week or 7 Day Week
Date Posted: August 26, 2011
Using T-SQL I was working with a number of date ranges for the purpose of calculating various durations. After building a significant query that aggregates various counts of days, we were tasked with including a breakdown based on a 5-day work week compared with a 7-day week.

In the past I have used a function I wrote that helps me break down work days and non-work days. In this context a function would start introducing performance issues that I would rather not incur. To accomplish this I used the following query:

SELECT
DATEDIFF(day,@STARTINGDATE, @ENDINGDATE)+1 AS SEVENDAYWEEK,
(DATEDIFF(day,@STARTINGDATE, @ENDINGDATE)+1)
- (DATEDIFF(week, @STARTINGDATE, @ENDINGDATE)*2)
- (CASE WHEN UPPER(DATENAME(dw, @STARTINGDATE)) = 'SATURDAY' THEN 1 ELSE 0 END)
- (CASE WHEN UPPER(DATENAME(dw, @ENDINGDATE)) = 'SUNDAY' THEN 1 ELSE 0 END) AS FIVEDAYWEEK
FROM MYTABLE
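As a cross-check on the T-SQL arithmetic, here is a straightforward Python count of the same two numbers over an inclusive range (the function name is mine):

```python
from datetime import date, timedelta

def week_counts(start, end):
    """Total days and Monday-to-Friday days in the inclusive range,
    matching the SEVENDAYWEEK / FIVEDAYWEEK columns above."""
    total = (end - start).days + 1
    weekdays = sum(
        1 for i in range(total)
        if (start + timedelta(days=i)).weekday() < 5  # Mon..Fri are 0..4
    )
    return total, weekdays
```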
 

 
Crystal Reports “Keep Together”: A Simple Explanation
Date Posted: August 24, 2011
Quite often we are asked by clients and junior Crystal Reports designers why the “Keep Together” feature doesn’t work. Even in the most complicated Crystal Reports the solution is usually quite simple, requiring only a basic understanding of how Keep Together is applied to the report. Here is a quick breakdown:

1) “Keep Together” defined from the section expert affects the specific section for which it is set. Example: If you turn on “Keep Together” for group header #3a this will only affect the section group header #3a and have no effect on any sections below. It will also not have any impact on group footer #3.

2) “Keep Together” defined from the “Change Group Options” does affect the group as a whole. This will cause the Crystal Reports engine to try and keep the group together regardless of how many sections are contained within and below that group.

The Keep Together feature is great; however, consider that it makes only a best effort to keep objects and sections together. If the object or section that you want kept together consumes more space than is physically available on the page, then naturally it will be distributed across pages.

Remember: the behavior of a specific section relying on “Keep Together” can change depending on what other “Keep Together” settings you have specified elsewhere in the report. This hierarchy can get complicated on reports with many groups/sections relying on the “Keep Together” functionality.
 

 
Crystal Report Engine Processing: Client or Server
Date Posted: August 05, 2011
When it comes to troubleshooting poor Crystal Report performance, one of the key elements designers should focus on is where the actual work is being performed: the server or the workstation. For numerous reasons you will likely want your server to figure out as much as possible before handing off the remaining work to Crystal Reports. This is beneficial because it reduces the load on the workstation and the network, and shortens the duration of the report. It is also desirable because your server is likely far more powerful than your desktop.

A very easy way to identify whether the server is performing the majority of the work (aside from watching performance monitors) is to use the “Show SQL Query” feature in the Crystal Reports designer. This shows how Crystal Reports has interpreted the query defined by your selection formula. Here is a sample of a selection formula and the subsequent SQL query to demonstrate:


As you will see, the SQL query has appropriately applied your criteria. This will result in the server determining which records your report will process.


The next example shows a slight change where we now use more than simple operators in our selection formula. Notice how the SQL query has changed.



Crystal Reports is no longer passing the criteria to SQL Server; instead it will request the entire contents of the table and process it on the workstation. Very slow.
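As an illustration of the two cases (field names hypothetical), a selection formula built from simple comparisons is translated into the generated WHERE clause, while one that wraps a database field in a function generally is not:

```
// Pushed to the server: the comparison appears in the generated WHERE clause
{Orders.OrderDate} >= DateTime(2011, 8, 1, 0, 0, 0)

// Evaluated on the workstation: Crystal must fetch every row to apply it
ToText({Orders.OrderID}, 0, "") = "123"
```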

The moral of the story: well-written selection formulas can make all the difference in the final performance of your reports.
 

 
Duplicate a Business Transaction within SQL without a T-SQL Cursor
Date Posted: August 05, 2011
On occasion clients ask for a ‘feature’ that will allow them to create a new transaction in their application that is an almost identical copy of an existing transaction. The new record should be virtually identical, with the exception of the primary keys and the odd date field.

The client wants this for efficiency reasons: hundreds of fields will be identical between the transactions, and they simply want to save their users time. Of course this is far more complex in the backend when the transaction is comprised of data spanning dozens of tables and hundreds of columns. A SQL cursor, with its manual declaration of columns, was not looking like much fun. It would get the job done but would take a long time to write and be tedious to read.

The solution was a simple (but somewhat long) SQL query that processes each of the related tables and injects the relevant data into a table variable declared in memory within the query. By creating one in-memory table with the single primary and foreign keys tracked as ‘NewID’ and ‘OldID’ we could prepare our new data first and then perform a simple “INSERT INTO MyTable SELECT” from the in-memory table into the live table. All of which is, of course, wrapped in a transaction.

Here is a snippet of the T-SQL code to accomplish this:
--REPORT_ITEM: Prepare
INSERT INTO @tmpItemTable
SELECT (@MaxItemID+(ROW_NUMBER() OVER(ORDER BY ITEMID ASC))) AS NEWITEMID,
ITEMID,
REPORTID,
SomeTextValue
FROM REPORT_ITEM
WHERE
REPORTID = @SOURCE_REPORTID

--REPORT_ITEM: Insert
INSERT INTO REPORT_ITEM
SELECT NEWITEMID AS ITEMID, REPORTID, SomeTextValue
FROM @tmpItemTable

The complete sample code can be downloaded here. This code includes logic to handle not only the primary key but also tables with foreign key relationships and still maintain the appropriate integrity of the relationships.

TSQLCopyPrimaryKey.zip
 

 
SQL: Multiple conditional aggregations using one query
Date Posted: July 21, 2011
While analyzing a database looking for performance issues I ran across some common queries that seemed a little too intensive while retrieving a series of important dates from the database. The purpose of these queries was simple enough: return the first date indicating a particular event or occasion. There were a number of different types of ‘events’ or ‘occasions’ in the database, indicated by a specific column identifying the type of date. Here is a quick sample table for demonstration purposes:

CREATE TABLE [DatesTable]
(
[ID] [int] NOT NULL,
[DATETYPEID] [int] NOT NULL,
[FOREIGNKEY] [int] NOT NULL,
[MYDATE] [datetime] NOT NULL
)

The application called multiple queries using the following method:
SELECT TOP 1 MYDATE FROM DatesTable WHERE DATETYPEID = 1 AND FOREIGNKEY = 123 ORDER BY MYDATE ASC
SELECT TOP 1 MYDATE FROM DatesTable WHERE DATETYPEID = 2 AND FOREIGNKEY = 123 ORDER BY MYDATE ASC
SELECT TOP 1 MYDATE FROM DatesTable WHERE DATETYPEID = 3 AND FOREIGNKEY = 123 ORDER BY MYDATE ASC

Since we noticed a total of seven distinct DATETYPEID values, the grand total of index scans was 7, with combined logical reads of 511. Although CPU was negligible (the indexes were working well), this group of queries was run thousands of times throughout the day and would add up in terms of load on the server.

Fortunately we were in a position where we have access to the code and could rewrite the queries. As a technical challenge we wanted to see if we could get a single query to satisfy the required functionality thus reducing load. Here is what we came up with:
SELECT MAX(VAL1), MAX(VAL2)
FROM
(
SELECT
MIN(
CASE WHEN [DATETYPEID] = 1 THEN MYDATE
ELSE NULL
END
) AS VAL1,
MIN(
CASE WHEN [DATETYPEID] = 2 THEN MYDATE
ELSE NULL
END
) AS VAL2,
(....trimmed for readability sake....)
FROM DatesTable
WHERE FOREIGNKEY = 123
) AS T

The result: a single index seek with only 73 logical reads - a significant reduction!
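The same collapse of seven lookups into one pass can be sketched in Python against the sample schema (rows as dictionaries; names follow the table above):

```python
from datetime import date

def first_dates(rows, foreign_key):
    """One pass over the rows, tracking the earliest MYDATE per DATETYPEID,
    mirroring the single-query MIN(CASE ...) rewrite."""
    firsts = {}
    for r in rows:
        if r["FOREIGNKEY"] != foreign_key:
            continue
        t = r["DATETYPEID"]
        if t not in firsts or r["MYDATE"] < firsts[t]:
            firsts[t] = r["MYDATE"]
    return firsts

rows = [
    {"DATETYPEID": 1, "FOREIGNKEY": 123, "MYDATE": date(2011, 7, 5)},
    {"DATETYPEID": 1, "FOREIGNKEY": 123, "MYDATE": date(2011, 7, 1)},
    {"DATETYPEID": 2, "FOREIGNKEY": 123, "MYDATE": date(2011, 7, 9)},
    {"DATETYPEID": 2, "FOREIGNKEY": 999, "MYDATE": date(2011, 6, 1)},
]
```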
 

 
SQL Server Setup & ERP Deployments
Date Posted: July 12, 2011
We are frequently asked by ERP consultants why their software (e.g. Sage Accpac, Great Plains) is unable to communicate with a SQL Server in a new environment. Most often these issues have common solutions. Here are some of the most common issues (and solutions) we encounter:

1) SQL Instance Name
During installation of SQL Server, manually specify and document your instance name. Confirm that you can connect to the instance name from SQL Management Studio. Do not rely on the default (aka “(local)”).

2) SQL Services
Make sure the SQL Services (including the SQL Browser service) have been started on the SQL Server.

3) Security
Do not use the username “SA” for application authentication. This account gives end-users total control of SQL Server and possibly access to other non-SQL areas of your server. You may use this account to ‘test’ connectivity; if the SA account works while other accounts do not, you will need to further examine how your account has been configured.

4) SQL Configuration: Network Protocols

Shared Memory:
This protocol only serves connections made on the server itself, so it is not relevant for most database deployments. Remote requests will not be satisfied.

Named Pipes
This is an authenticated protocol which, as per Microsoft’s recommendation, is one of the fastest protocols and thus often the recommended one. In a few cases, however, this protocol will not satisfy the requirements of the remote application.

TCP/IP
This protocol is a great candidate as it tends to be more durable in situations where network speed may be variable due to load or distribution (e.g. a WAN environment). In our experience TCP/IP is the most reliable protocol.



5) Firewall
Ensure the firewall on your SQL server is either disabled or has the appropriate rules configured to allow SQL traffic. Disabling the firewall should be done at the discretion of the network administrator.
 

 
Crystal Reports 2008 "Failed to save document"
Date Posted: May 31, 2011
While working with Crystal Reports 2008 (12.2.0.290) we found an intermittent problem where our reports could not be saved, producing the error “Failed to save document”. All versions of Crystal Reports, from v5 to 2008, have periodically had a problem where you can no longer save your work to the same RPT file; the program gives an error about being unable to overwrite the file. The old solution was to simply save as another file. Thinking this was our issue we tried “Save As”, but no luck. It was looking as though we were going to lose all of our changes and have to rewrite the report.

Fortunately we discovered a solution. Navigate to “Database\Log On or Off Server” from the menu. Click on each database connection and check whether you are still logged in. Log off each connection that is active. When finished, clicking on any database connection shown in the window should no longer offer the option to log off.


Now, using the “Database\Set Datasource Location” menu item, create a new connection to your database and follow the usual steps for specifying your database connection objects. Problem solved!

Initially we tried to jump straight to the “Set Datasource Location” step, however this did not solve the problem as it would have in previous versions of Crystal Reports.
 

 
Terminal Server & iPads
Date Posted: May 21, 2011
I have experimented with a few iPad RDP applications in the past and found them cumbersome at best. I recently discovered that "iTap RDP" (version 1.7.2) works very well. I wouldn’t suggest using it to edit Microsoft Excel or Word documents on Terminal Server, however if you need quick RDP access to get in and out quickly this app does the trick nicely. They also provide an iPhone version, but I haven’t bothered given the obvious screen constraints.

iTap Mobile
 

 
Legacy Application Development: Visual Basic 6 ADO problem with SQL 2008
Date Posted: May 04, 2011
While working on a very old project that uses Visual Basic 6 connected to SQL 2008 in our development environment, we ran into an issue where some of our ADO recordset updates were failing with the old error “Row cannot be located for updating”.



After fighting with the issue for a while we discovered a behavioural difference when the VB6 application (using ADO) updated a table containing a trigger. If the source database was on SQL 2000 or SQL 2005 there was no problem. If the database was on SQL 2008 the update would fail with the error noted above.

The solution was simply to add SET NOCOUNT ON within the trigger. Although the purpose of the NOCOUNT setting is no secret, the fact that VB6 and ADO exhibit different behaviour depending on the database engine is an important point to remember when supporting old VB6 applications connecting to SQL 2008.
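As a sketch (table and column names hypothetical), the fix sits at the top of the trigger body:

```sql
-- SET NOCOUNT ON suppresses the extra "N rows affected" messages that the
-- trigger's own statements generate; without it, ADO can misread those
-- counts when verifying its update.
CREATE TRIGGER trg_Orders_Audit ON Orders
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO OrdersAudit (OrderID, ChangedAt)
    SELECT OrderID, GETDATE() FROM inserted;
END
```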
 

 
Crystal Reports 2008 Subreport Linking Inconsistency
Date Posted: April 28, 2011
Recently we ran into an issue where a report would suddenly lose the ability to pass correct data down to its subreport links. The result was a subreport that started returning large amounts of data even though the embedded selection formula and related subreport links explicitly dictated very specific criteria for the subreport. What made this issue more confusing was that the problem only manifested itself after generating around 100 pages of a 400-page report. Since the subreport was processed on each of the initial 100 pages, we were perplexed. We also found the problem would only occur when exporting the Crystal Report to PDF; the report would execute perfectly from within the Crystal Reports 2008 designer.

After examining the data to confirm that no data irregularities were at fault, we discovered the problematic subreports were linked to a parameter on the parent report. Although this value remains the same throughout the life of the report, we discovered that in Crystal Reports 2008 (running the latest version at the time of this writing), if you define the source of your subreport link as the parent report parameter, then ‘sometimes’ the related subreport would not only ignore this linkage but ignore the subreport criteria entirely. The following are two screenshots showing the original method that produced inconsistent results and the second method that works consistently.

Incorrect: (screenshot not reproduced)

Correct: (screenshot not reproduced)
 

 
Applying Conditional Sort Logic in T-SQL
Date Posted: April 19, 2011
Hopefully in most situations you are working with a proper schema and query logic where elaborate, micromanaged sorting of results is not necessary. As some purists will tell you, using an ORDER BY in your SQL queries slows down your results and adds load to your servers, worsened by how you are sorting and by the size of the data to be sorted.

Of course there are those special scenarios where query efficiency needs to take a back seat to query output. If you have a query that requires the dataset returned in a very specific order, try putting a CASE statement, a nested query, or both within your ORDER BY clause. I should warn you this is not a very efficient method of retrieving data, however it is a very simple way of getting fine-tuned control over your output. In the example below you can substitute any logic you wish into the nested CASE statement. By translating your data into a numeric value you can control the order of the output very easily. I have also used a nested query within the ORDER BY to demonstrate how you can go further and create whatever queries you like to achieve the sort order you are looking for.

SELECT P1.* FROM PRODUCT_HEADER P1
ORDER BY
(
SELECT
CASE WHEN PRODUCTID = 5 THEN 1
WHEN PRODUCTID = 6 THEN 2
ELSE 0 END
FROM PRODUCT_HEADER
WHERE PRODUCTID = P1.PRODUCTID
) DESC
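Should you prefer to keep the ordering out of SQL entirely, the equivalent in application code is a sort key mirroring the CASE statement (a sketch with hypothetical rows):

```python
def sort_products(rows):
    """Mimic the ORDER BY CASE ... DESC above: rank PRODUCTID 5 as 1,
    6 as 2, everything else 0, then sort by that rank descending."""
    rank = {5: 1, 6: 2}
    return sorted(rows, key=lambda r: rank.get(r["PRODUCTID"], 0), reverse=True)

rows = [{"PRODUCTID": 4}, {"PRODUCTID": 5}, {"PRODUCTID": 6}]
```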
 

 
Crystal Reports: Page Headers in Subreports
Date Posted: April 19, 2011
The content of subreports embedded within Crystal Reports will often span multiple pages. The Crystal Reports designer does not make it clear how you can have subreport-specific content (e.g. column headers) appear at the top of each page just as it would with a traditional page header.

A quick solution is to create a group in the subreport and, within the configuration of that group, turn on the feature named “Repeat Group Header on Each Page”. This causes the group header to be displayed just below the main report’s page headers, giving the appearance of an additional page header.

Be warned! We have seen instances with Crystal Reports 2008 where dynamic suppression formulas embedded within a subreport using this technique may not execute reliably without the addition of “WhilePrintingRecords” in the suppression formula, even if the behaviour of the formula would not ordinarily require it. In those cases, prior to adding “WhilePrintingRecords” to the suppression formula, the problem manifested itself as the subreport simply generating no further data after the first page, while the remaining portion of the main report continued as though no problem existed.
 

 
Crystal Reports, Sage Accpac & Performance
Date Posted: March 30, 2011
Poor report performance can have many causes. One of the more common scenarios we encounter relates to Sage Accpac Crystal Reports. Since Sage offers customers the option of choosing from a number of database engines, they must limit the database design to employ only those features that work across all of them. Unfortunately this has constrained the data types stored in the database, one of the more critical being date/time fields.

For years Sage has provided a function named ‘PWFORMATDATE’. Simply put, this Crystal Reports function converts the Accpac numeric date field into a true date that Crystal Reports can then use to filter data. The downside: all records from the tables within the report need to be read by the workstation, since the workstation is where that function reads and converts the information into a real date. Imagine a simple report showing yesterday’s sales having to read through the last 10 years of data just to convert each of the date values, and how this might slow down your computer, network and server. Simple solution: perform a numeric comparison in the selection formula, or write a database view (SQL) that turns the value into a true date on the server. This avoids your workstations having to pull all of the data across the network and process it locally.
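For example, a server-side view can do the conversion (OEORDH/ORDDATE are the typical Accpac order-header table and numeric date column; verify the names against your own schema):

```sql
-- Accpac stores dates as numbers in YYYYMMDD form. Style 112 tells
-- CONVERT to parse 'yyyymmdd', producing a real datetime on the server
-- so Crystal can filter there instead of on the workstation.
CREATE VIEW vOEORDH_Dates AS
SELECT ORDUNIQ,
       ORDNUMBER,
       CONVERT(datetime, CAST(CAST(ORDDATE AS int) AS varchar(8)), 112) AS ORDDATE_DT
FROM OEORDH
```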

This is one of the more common causes of poor performance we run into while working with Sage Accpac. If you have a report that is running slowly (longer than 15 seconds) contact us to see what changes could make a dramatic difference.