SanPrg
asked on
System.OutOfMemoryException
Hello,
I have a SQL report that pulls data via a SProc from one table.
The report has 6 columns and 3 row groups.
It works fine when filtered for 3 days, but when I change the filter to 7 days or more I get the error message System.OutOfMemoryException.
If I run the SProc with the same date range, it works fine and retrieves more than 1 million rows.
Thanks
Would you like to post the stored procedure? That way we may be able to make suggestions to make it more efficient.
ASKER
Here you go:
CREATE PROCEDURE [dbo].[rptParm]
    @Begin datetime
    ,@End datetime
AS
BEGIN
    SET NOCOUNT ON;
    SELECT
        CAST([Time] AS date) [Date]
        ,[Time]
        ,SUBSTRING([Path], CHARINDEX('/', [Path], 1) + 1, 255) [Path]
        ,[Note]
        ,[Status]
        ,[Name]
        ,SUBSTRING([Client], CHARINDEX('\', [Client], 1) + 1, 255) [Client]
    FROM [DWClient]
    WHERE CAST([Time] AS date) BETWEEN @Begin AND @End
    ORDER BY [Time], SUBSTRING([Client], CHARINDEX('\', [Client], 1) + 1, 255)
END
GO
If [Time] is not a date or does not have the format "YYYYMMDD", then I am afraid no amount of indexing is going to help you here. The best you can do is create a persisted computed column with a value of CAST([Time] AS date) and call it [Date2]. You can then index that computed column and modify your stored procedure as follows:
CREATE PROCEDURE [dbo].[rptParm]
@Begin datetime,
@End datetime
AS
SET NOCOUNT ON;
DECLARE @BeginDate date = CAST(@Begin AS date),
@EndDate date = CAST(@End AS date)
SELECT CAST([Time] AS date) [Date],
[Time],
SUBSTRING([Path], CHARINDEX('/', Path, 1) + 1, 255) Path,
[Note],
[Status],
[Name],
SUBSTRING([Client], CHARINDEX('\', [Client], 1) + 1, 255) [Client]
FROM [DWClient]
WHERE Date2 BETWEEN @BeginDate AND @EndDate
ORDER BY [Time],
SUBSTRING([Client], CHARINDEX('\', [Client], 1) + 1, 255)
GO
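For reference, the persisted computed column and supporting index described above could be created like this. A sketch only: the column name [Date2] comes from the suggestion above, while the index name and INCLUDE list are illustrative and should be adjusted to your table:

```sql
-- Persisted computed column so the CAST result is stored and indexable
ALTER TABLE dbo.DWClient
    ADD [Date2] AS CAST([Time] AS date) PERSISTED;

-- Index on the computed column; INCLUDE covers the report's columns
-- so the query can be satisfied without key lookups (illustrative)
CREATE NONCLUSTERED INDEX IX_DWClient_Date2
    ON dbo.DWClient ([Date2])
    INCLUDE ([Time], [Path], [Note], [Status], [Name], [Client]);
```

With that in place, the WHERE Date2 BETWEEN @BeginDate AND @EndDate predicate in the modified procedure becomes sargable and can seek on the index instead of scanning and casting every row.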
SOLUTION
You have a system here that has data at one end and a report at the other. The report has a tiny amount of data (but lots of information) and the database has a huge amount of data.
Somewhere in the middle (the stored procedure and SSRS) the data is getting aggregated and reduced so that you get information out. If you do more of that in the stored procedure then the server will be doing the work. If you shift the load out to the SSRS system then it'll be the machine running SSRS that does the work (and the rendering, etc.). It looks like you should be doing more in the stored procedure part, rather than dumping a million rows onto SSRS!
hth
Mike
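A sketch of what pushing the aggregation down into the stored procedure could look like. The grouping columns and the COUNT are illustrative, since the report's actual row groups aren't shown here:

```sql
-- Hypothetical: return one summary row per day/client/status
-- instead of shipping a million detail rows to SSRS
SELECT  CAST([Time] AS date)                                   AS [Date]
        ,SUBSTRING([Client], CHARINDEX('\', [Client]) + 1, 255) AS [Client]
        ,[Status]
        ,COUNT(*)                                               AS [RowCount]
FROM    dbo.DWClient
WHERE   CAST([Time] AS date) BETWEEN @Begin AND @End
GROUP BY CAST([Time] AS date)
        ,SUBSTRING([Client], CHARINDEX('\', [Client]) + 1, 255)
        ,[Status]
ORDER BY [Date], [Client];
```

The idea is that the row groups SSRS is currently building from raw detail rows would instead be fed pre-aggregated rows, so the report server holds thousands of rows in memory rather than millions.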
ASKER
Guys, sorry for the delay. I removed the ORDER BY from the SProc and removed all aggregation from the report, but I still get the same error message.
I found an article online: http://msdn.microsoft.com/en-us/library/ms159206.aspx
and I changed some configuration settings in RSReportServer.config, but I still have the same problem.
What is the scenario for SSRS with huge data, like a billion rows?
I really need an answer to this question; I hope you can help me.
Thanks
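For anyone following along, the settings in RSReportServer.config that govern report server memory (per the linked MSDN article, SSRS 2008 and later) look like the fragment below. These values are purely illustrative; WorkingSetMaximum/WorkingSetMinimum are in kilobytes and must be sized to the machine:

```xml
<!-- Illustrative values only -->
<MemorySafetyMargin>80</MemorySafetyMargin>
<MemoryThreshold>90</MemoryThreshold>
<WorkingSetMaximum>4194304</WorkingSetMaximum> <!-- ~4 GB, in KB -->
<WorkingSetMinimum>1048576</WorkingSetMinimum> <!-- ~1 GB, in KB -->
```

Raising these limits only delays the failure, though: if the dataset itself is far larger than available memory, the fix is to reduce the rows the report consumes, not to enlarge the ceiling.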
ASKER
Thanks SouthMod.
I really need an answer to this question; I hope you can help me.
I am sorry you did not like my solution.
Best of luck.
ASKER
acperkins,
"I am sorry you did not like my solution."
Like it or not, did you test your "solution" with a billion rows in a SQL report?
I am sorry, but I tested your solution and got the same error message.
Thanks for your effort
Very nice. Thank you.
SanPrg,
I would say that a million rows was too much for SSRS. As before, I would strongly suggest that you move the work on the data back into the stored procedure rather than have SSRS try to handle it, which it obviously can't!
hth
Mike
ASKER
Mike,
As I said before, I removed all aggregations just as a test, and the report works for 7 or 10 days but not more. I agree that I have a huge amount of data, but I need to know:
Does SSRS work with huge data?
What is the scenario for SSRS with huge data, like a billion rows?
Thanks Mike
ASKER CERTIFIED SOLUTION
ASKER
Mike,
Your links have good information.
Thanks to all of you.