Solved

Modify Existing SP to use pre-calc'd datachange

Posted on 2010-09-10
Medium Priority
337 Views
Last Modified: 2012-05-10
I currently have an SP in SQL 2008 that takes data from a datalogging device (Meter), which logs in 15-minute samples as an accumulating value.

The data is then manipulated into daily, weekly, monthly and yearly results, with costs applied etc.

I'm now changing the datalogger to record the datachange instead, so every 15 minutes the change from the previous to the current reading is recorded in the table. So I need to modify the existing SP to work with the new logging format.
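For clarity, here is a small Python sketch (purely illustrative, not part of the SP) of what the format change means: an accumulating series is turned into 15-minute deltas, with negative differences from a meter reset clamped to zero — the same logic the SP's UPDATE against #ReportData performs today.

```python
# Illustrative only: convert an accumulating meter series into per-interval
# deltas, clamping negative differences (meter reset or tampering) to zero.
def to_deltas(readings):
    """readings: list of accumulating meter values in timestamp order."""
    deltas = []
    prev = None
    for value in readings:
        if prev is None:
            delta = 0.0          # first sample has no predecessor
        else:
            delta = value - prev
            if delta < 0:        # meter reset (or tampering)
                delta = 0.0
        deltas.append(delta)
        prev = value
    return deltas

print(to_deltas([100.0, 102.5, 105.0, 3.0, 7.0]))
# [0.0, 2.5, 2.5, 0.0, 4.0] -- the drop from 105.0 to 3.0 clamps to 0.0
```

With the new logging format the device supplies the delta column directly, so this whole conversion step disappears from the SP.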
USE [EnergySuiteDB]
GO
/****** Object:  StoredProcedure [dbo].[GetCostPerPeriod]    Script Date: 09/06/2010 12:36:19 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER PROCEDURE [dbo].[GetCostPerPeriod] @sDate DATETIME,@Meter int,@GasRate int,@pf float,@util varchar(50),@reportselection char(1),@co2 float,@Carbon float,@calc float, @occtl int, @oattl int, @Spacetl int,@eDate datetime

AS
BEGIN

-- A power factor of 0 will throw an exception, so this needs to be taken care of

if isnull(@pf,0)=0 set @pf = 1

-- declare and set up some variables based on incoming parameters

DECLARE @PrevVal NUMERIC(12, 4)
declare @grate numeric(12,4)
declare @wdate datetime
declare @intervals int

 if isnull(@edate,'19000101') > '19000101' set @sdate = convert(char(8),@sdate,112)   -- make sure time is at "start" of period
 if (isnull(@edate,'19000101') < '19000102'  and @reportselection = 'M') set @sdate = DATEADD(month, DATEDIFF(month, 0, @sdate),0)
 if (isnull(@edate,'19000101') < '19000102' and @reportselection = 'Y') set @sdate = datename(year,@sdate) + '0101'



if isnull(@edate,'19000101') > '19000101' set @edate = convert(char(8),@edate,112) + ' 23:59:59.997'
If isnull(@edate,'19000101') < '19000102' and @reportselection not in ('Y','M','W') set @edate = convert(char(8),@sdate,112) + ' 23:59:59.997'
If isnull(@edate,'19000101') < '19000102' and @reportselection = 'W' set @edate = convert(char(8),@sdate+6,112) + ' 23:59:59.997'
If isnull(@edate,'19000101') < '19000102' and @reportselection = 'M' set @edate = DATEADD(month, DATEDIFF(month, -1, @sdate),0) - 0.00000005
If isnull(@edate,'19000101') < '19000102' and @reportselection = 'Y' set @edate = datename(year,@sdate) + '1231 23:59:59.997'

set @intervals = day(dateadd(month,datediff(month,-1,@sdate),0) - 1) 
If @reportselection not in ('Y','M','W') set @intervals = @intervals * 96  -- 96 quarter hours in a day.
If @reportselection = 'Y' set @intervals = 1

if isnull(@gasrate,0) = 1
   set @grate = 11.0786 -- 1.02264 * 39 / 3.6
else
   set @grate = 1.0


-- Now create our "periods" table to ensure all reporting date / times are being covered in the granularity required.

if object_id('tempdb..#periods','U') is not null drop table #periods
create table #periods (PSDate datetime, PEDate datetime, PSName varchar(30), DayParts decimal(12,6))

set @wdate = @sdate

While @wdate <=  @edate
begin

   insert #periods (PSDate, PEDate, PSName, DayParts)

   select @wdate as PSDate
          ,case when @reportselection = 'Y' 
                then DATEADD(month, DATEDIFF(month, -1, @wdate),0) - 0.00000005 
                else case when @reportselection in ('M','W') 
                          then convert(char(8),@wdate,112) + ' 23:59:59.997' 
                          else dateadd(minute,15,@wdate) - 0.00000005 
                          end 
                end as PEDate
          ,case when @reportselection = 'Y' 
                then left(datename(month,@wdate),3) + ' ' + datename(year,@wdate)
                else case when @reportselection in ('M','W')
                          then left(datename(dw,@wdate),03) + ' ' + convert(char(8),@wdate,03)
                          else left(datename(dw,@wdate),03) + ' ' + convert(char(8),@wdate,03) +right(convert(char(19),@wdate,100),8)  
                          end 
                end as PSName
          ,case when @reportselection = 'Y' 
                then day(dateadd(month, datediff(month, -1, @wdate),0) - 1) 
                else case when @reportselection in ('M','W') 
                          then 1.0 
                          else 1.0 / 96.0 
                          end 
                end as dayparts
 
   If @reportselection not in ('Y','M','W') set @wdate = dateadd(minute,15,@wdate) 
   If @reportselection in     ('Y')         set @wdate = dateadd(month,1,@wdate) 
   If @reportselection in     ('M','W')     set @wdate = dateadd(day,1,@wdate) 

end

-- now create our temp report data extract from main history table

IF OBJECT_ID('tempdb..#ReportData','U') IS NOT NULL DROP TABLE #ReportData  -- should not be needed
 
CREATE TABLE #ReportData
(
    ID int identity primary key,
    TimeStamp DATETIME,
    Hundredths INT,      
    Data NUMERIC(12, 4),
    DataChange NUMERIC(12, 4),
    DateStamp datetime,
    CalendarDate datetime,
    Occupancy int,
    OAT numeric (12,4),
    SpaceTemp numeric (12,4)

)

-- populate our report data

INSERT INTO #ReportData (Timestamp, Hundredths, Data, Datestamp, CalendarDate, Occupancy, OAT, SpaceTemp)
SELECT Timestamp, Hundredths, convert(real,Data) * @grate*@calc 
       ,case when @reportselection = 'Y' then convert(char(6),timestamp,112)+'01' else convert(char(8),timestamp,112) end
       ,convert(char(8),timestamp,112)
       ,case when @occtl = 0 then 2 else (SELECT top 1 convert(int,data) 
                 FROM HistorianDB.dbo.TLData O
                 WHERE O.TLInstance = @Occtl
                 AND O.timestamp <= M.timestamp
                 AND O.Type=0 
                 AND isnumeric(O.data) = 1
                 ORDER BY O.timestamp desc) end
       ,(SELECT top 1 convert(real,data) 
         FROM HistorianDB.dbo.TLData O
         WHERE O.TLInstance = @OatTl
         AND O.timestamp between @sdate and M.timestamp
         AND O.Type=0 
         AND isnumeric(O.data) = 1
         ORDER BY O.timestamp desc)
       ,(SELECT top 1 convert(real,data) 
         FROM HistorianDB.dbo.TLData O
         WHERE O.TLInstance = @SpaceTl
         AND O.timestamp between @sdate and M.timestamp
         AND O.Type=0 
         AND isnumeric(O.data) = 1
         ORDER BY O.timestamp desc)

FROM HistorianDB.dbo.TLData M
WHERE TLInstance = @Meter 
AND m.timestamp between @sdate and @edate
AND m.Type=0 
AND isnumeric(data)=1
ORDER BY m.timestamp      

-- update our data to calculate the difference between meter readings and cost

UPDATE #reportdata
SET    DataChange = #reportdata.Data - isnull(prev.data,#reportdata.Data)
FROM   #ReportData 
left outer join #reportdata prev on #reportdata.id - 1 = prev.id

-- now check for negative numbers due to meter reset (or tampering)

UPDATE #reportdata  SET  DataChange = 0 WHERE DataChange < 0

-- seems to help if we create indexes for the larger date ranges

create index idx_time_reportdata on #reportdata (datestamp,datachange desc,timestamp desc)
create index idx_date_reportdata on #reportdata (calendardate,oat)

-- now we can run the report

SELECT 
    PSname as Date,psdate,pedate,
    SUM(isnull(r1.datachange,0)) AS day_data,
    SUM(isnull(r1.datachange,0))*@CO2 AS CO2_data,
    SUM(isnull(r1.datachange,0))*@Carbon AS Carbon_data,
    SUM(isnull(r1.datachange,0)*chargerate) AS DataCost,
    SUM(isnull(r1.datachange,0)*levy) as TotalLevy,
    max(case when r1.datestamp is null then 0 else caprate end)/@intervals  as TotalCap,                           
    max(case when r1.datestamp is null then 0 else chargecap end)  as ChargeCap,
    max(case when r1.datestamp is null then 0 else dayparts * StandingDayCharge end) as StandingDayCharge,
    max(case when r1.datestamp is null then 0 else dayparts * StandingDayCharge end) 
    + SUM(isnull(r1.datachange,0)*(chargerate + levy)) 
    + max(case when r1.datestamp is null then 0 else caprate end)/@intervals  as TotalCost,
    isnull(((case when @reportselection = 'D' then SUM(isnull(r1.datachange,0)) else (select top 1 datachange from #reportdata r2 where r2.datestamp = r1.datestamp order by r2.datachange desc) end)*4)/@pf,0) as MaxkVA,
    isnull(((case when @reportselection = 'D' then SUM(isnull(r1.datachange,0)) else (select top 1 datachange from #reportdata r2 where r2.datestamp = r1.datestamp order by r2.datachange desc) end)*4),0) as MaxkW,
    (select top 1 timestamp from #reportdata r2 where r2.datestamp = r1.datestamp order by r2.datachange desc,r2.timestamp desc) as MaxDataChangeTimeStamp,
    sum(case when isnull(occupancy,2) = 0 then isnull(r1.datachange,0) else 0 end) as OffHoursTotal,
    sum(case when isnull(occupancy,2) = 1 then isnull(r1.datachange,0) else 0 end) as OnHoursTotal,
    avg(isnull(r1.Spacetemp,000.0)) as AvgSpaceTemp,
    case when @reportselection = 'D' or min(isnull(r1.Spacetemp,999.9)) = 999.9 then 0.0 else min(isnull(r1.Spacetemp,999.9)) end as MinSpaceTemp,
    case when @reportselection = 'D' then 0.0 else max(isnull(r1.Spacetemp,000.0)) end as MaxSpaceTemp,
    avg(isnull(r1.oat,000.0)) as AvgOat,
    case when @reportselection = 'D' or min(isnull(r1.Oat,999.9)) = 999.9 then 0.0 else min(isnull(r1.Oat,999.9)) end as MinOat,
    case when @reportselection = 'D' then 0.0 else max(isnull(r1.oat,000.0)) end as MaxOat,
    (select isnull(sum(dd),0) 
   from (select case when avg(isnull(r2.oat,000.0)) < 15.5 
                     then 15.5 - avg(isnull(r2.oat,000.0)) 
                     else 0.0 end as DD
         from #reportdata r2 
         where convert(char(8),r2.timestamp,112) between convert(char(8),psdate,112) and convert(char(8),pedate,112) 
         group by convert(char(8),r2.timestamp,112)) dr2) as DegreeDays,
    @edate as end_date_range 

FROM #Periods
LEFT OUTER JOIN #ReportData r1 on r1.timestamp between psdate and pedate
cross apply dbo.udf_getrates_3(r1.Timestamp,@util)    
GROUP BY PSname,r1.datestamp,PSdate,pedate
ORDER BY PSdate

END
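The DegreeDays column in the final SELECT is worth spelling out: the subquery averages OAT per calendar day, then accumulates (15.5 − daily average) for days whose average falls below the 15.5-degree heating base. A hedged Python sketch of the same calculation (function name and sample data are invented for illustration):

```python
# Illustrative sketch (not part of the SP) of the DegreeDays subquery:
# average OAT per calendar day, then accumulate (base - daily average)
# for days below the 15.5 degree heating base.
from collections import defaultdict
from datetime import datetime

def degree_days(samples, base=15.5):
    """samples: iterable of (timestamp, oat) pairs."""
    by_day = defaultdict(list)
    for ts, oat in samples:
        by_day[ts.date()].append(oat)
    total = 0.0
    for oats in by_day.values():
        avg = sum(oats) / len(oats)
        if avg < base:
            total += base - avg
    return total

# Invented sample data: two readings on day one, one on day two.
samples = [
    (datetime(2010, 9, 1, 0, 0), 10.0),
    (datetime(2010, 9, 1, 12, 0), 12.0),  # day avg 11.0 -> 4.5 degree days
    (datetime(2010, 9, 2, 0, 0), 16.0),   # day avg 16.0 -> above base, 0
]
print(degree_days(samples))  # 4.5
```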


Question by:wint100
17 Comments
 
LVL 51

Accepted Solution

by: Mark Wills (earned 2000 total points)
ID: 33652844
Well, if datachange is now available from tldata rather than calculating it, then all we need to do is populate our temp table with datachange and remove the update.

ie:

ALTER PROCEDURE [dbo].[GetCostPerPeriod] @sDate DATETIME,@Meter int,@GasRate int,@pf float,@util varchar(50),@reportselection char(1),@co2 float,@Carbon float,@calc float, @occtl int, @oattl int, @Spacetl int,@eDate datetime

AS
BEGIN

-- A power factor of 0 will throw an exception, so this needs to be taken care of

if isnull(@pf,0)=0 set @pf = 1

-- declare and set up some variables based on incoming parameters

DECLARE @PrevVal NUMERIC(12, 4)
declare @grate numeric(12,4)
declare @wdate datetime
declare @intervals int

 if isnull(@edate,'19000101') > '19000101' set @sdate = convert(char(8),@sdate,112)   -- make sure time is at "start" of period
 if (isnull(@edate,'19000101') < '19000102'  and @reportselection = 'M') set @sdate = DATEADD(month, DATEDIFF(month, 0, @sdate),0)
 if (isnull(@edate,'19000101') < '19000102' and @reportselection = 'Y') set @sdate = datename(year,@sdate) + '0101'



if isnull(@edate,'19000101') > '19000101' set @edate = convert(char(8),@edate,112) + ' 23:59:59.997'
If isnull(@edate,'19000101') < '19000102' and @reportselection not in ('Y','M','W') set @edate = convert(char(8),@sdate,112) + ' 23:59:59.997'
If isnull(@edate,'19000101') < '19000102' and @reportselection = 'W' set @edate = convert(char(8),@sdate+6,112) + ' 23:59:59.997'
If isnull(@edate,'19000101') < '19000102' and @reportselection = 'M' set @edate = DATEADD(month, DATEDIFF(month, -1, @sdate),0) - 0.00000005
If isnull(@edate,'19000101') < '19000102' and @reportselection = 'Y' set @edate = datename(year,@sdate) + '1231 23:59:59.997'

set @intervals = day(dateadd(month,datediff(month,-1,@sdate),0) - 1)
If @reportselection not in ('Y','M','W') set @intervals = @intervals * 96  -- 96 quarter hours in a day.
If @reportselection = 'Y' set @intervals = 1

if isnull(@gasrate,0) = 1
   set @grate = 11.0786 -- 1.02264 * 39 / 3.6
else
   set @grate = 1.0


-- Now create our "periods" table to ensure all reporting date / times are being covered in the granularity required.

if object_id('tempdb..#periods','U') is not null drop table #periods
create table #periods (PSDate datetime, PEDate datetime, PSName varchar(30), DayParts decimal(12,6))

set @wdate = @sdate

While @wdate <=  @edate
begin

   insert #periods (PSDate, PEDate, PSName, DayParts)

   select @wdate as PSDate
          ,case when @reportselection = 'Y'
                then DATEADD(month, DATEDIFF(month, -1, @wdate),0) - 0.00000005
                else case when @reportselection in ('M','W')
                          then convert(char(8),@wdate,112) + ' 23:59:59.997'
                          else dateadd(minute,15,@wdate) - 0.00000005
                          end
                end as PEDate
          ,case when @reportselection = 'Y'
                then left(datename(month,@wdate),3) + ' ' + datename(year,@wdate)
                else case when @reportselection in ('M','W')
                          then left(datename(dw,@wdate),03) + ' ' + convert(char(8),@wdate,03)
                          else left(datename(dw,@wdate),03) + ' ' + convert(char(8),@wdate,03) +right(convert(char(19),@wdate,100),8)  
                          end
                end as PSName
          ,case when @reportselection = 'Y'
                then day(dateadd(month, datediff(month, -1, @wdate),0) - 1)
                else case when @reportselection in ('M','W')
                          then 1.0
                          else 1.0 / 96.0
                          end
                end as dayparts
 
   If @reportselection not in ('Y','M','W') set @wdate = dateadd(minute,15,@wdate)
   If @reportselection in     ('Y')         set @wdate = dateadd(month,1,@wdate)
   If @reportselection in     ('M','W')     set @wdate = dateadd(day,1,@wdate)

end

-- now create our temp report data extract from main history table

IF OBJECT_ID('tempdb..#ReportData','U') IS NOT NULL DROP TABLE #ReportData  -- should not be needed
 
CREATE TABLE #ReportData
(
    ID int identity primary key,
    TimeStamp DATETIME,
    Hundredths INT,      
    Data NUMERIC(12, 4),
    DataChange NUMERIC(12, 4),
    DateStamp datetime,
    CalendarDate datetime,
    Occupancy int,
    OAT numeric (12,4),
    SpaceTemp numeric (12,4)

)

-- populate our report data

INSERT INTO #ReportData (Timestamp, Hundredths, DataChange, Datestamp, CalendarDate, Occupancy, OAT, SpaceTemp)
SELECT Timestamp, Hundredths, convert(real,DataChange) * @grate*@calc
       ,case when @reportselection = 'Y' then convert(char(6),timestamp,112)+'01' else convert(char(8),timestamp,112) end
       ,convert(char(8),timestamp,112)
       ,case when @occtl = 0 then 2 else (SELECT top 1 convert(int,data)
                 FROM HistorianDB.dbo.TLData O
                 WHERE O.TLInstance = @Occtl
                 AND O.timestamp <= M.timestamp
                 AND O.Type=0
                 AND isnumeric(O.data) = 1
                 ORDER BY O.timestamp desc) end
       ,(SELECT top 1 convert(real,data)
         FROM HistorianDB.dbo.TLData O
         WHERE O.TLInstance = @OatTl
         AND O.timestamp between @sdate and M.timestamp
         AND O.Type=0
         AND isnumeric(O.data) = 1
         ORDER BY O.timestamp desc)
       ,(SELECT top 1 convert(real,data)
         FROM HistorianDB.dbo.TLData O
         WHERE O.TLInstance = @SpaceTl
         AND O.timestamp between @sdate and M.timestamp
         AND O.Type=0
         AND isnumeric(O.data) = 1
         ORDER BY O.timestamp desc)

FROM HistorianDB.dbo.TLData M
WHERE TLInstance = @Meter
AND m.timestamp between @sdate and @edate
AND m.Type=0
AND isnumeric(data)=1
ORDER BY m.timestamp      

-- update our data to calculate the difference between meter readings and cost
--
--UPDATE #reportdata
--SET    DataChange = #reportdata.Data - isnull(prev.data,#reportdata.Data)
--FROM   #ReportData
--left outer join #reportdata prev on #reportdata.id - 1 = prev.id

-- now check for negative numbers due to meter reset (or tampering)

--UPDATE #reportdata  SET  DataChange = 0 WHERE DataChange < 0

-- seems to help if we create indexes for the larger date ranges

create index idx_time_reportdata on #reportdata (datestamp,datachange desc,timestamp desc)
create index idx_date_reportdata on #reportdata (calendardate,oat)

-- now we can run the report

SELECT
    PSname as Date,psdate,pedate,
    SUM(isnull(r1.datachange,0)) AS day_data,
    SUM(isnull(r1.datachange,0))*@CO2 AS CO2_data,
    SUM(isnull(r1.datachange,0))*@Carbon AS Carbon_data,
    SUM(isnull(r1.datachange,0)*chargerate) AS DataCost,
    SUM(isnull(r1.datachange,0)*levy) as TotalLevy,
    max(case when r1.datestamp is null then 0 else caprate end)/@intervals  as TotalCap,                          
    max(case when r1.datestamp is null then 0 else chargecap end)  as ChargeCap,
    max(case when r1.datestamp is null then 0 else dayparts * StandingDayCharge end) as StandingDayCharge,
    max(case when r1.datestamp is null then 0 else dayparts * StandingDayCharge end)
    + SUM(isnull(r1.datachange,0)*(chargerate + levy))
    + max(case when r1.datestamp is null then 0 else caprate end)/@intervals  as TotalCost,
    isnull(((case when @reportselection = 'D' then SUM(isnull(r1.datachange,0)) else (select top 1 datachange from #reportdata r2 where r2.datestamp = r1.datestamp order by r2.datachange desc) end)*4)/@pf,0) as MaxkVA,
    isnull(((case when @reportselection = 'D' then SUM(isnull(r1.datachange,0)) else (select top 1 datachange from #reportdata r2 where r2.datestamp = r1.datestamp order by r2.datachange desc) end)*4),0) as MaxkW,
    (select top 1 timestamp from #reportdata r2 where r2.datestamp = r1.datestamp order by r2.datachange desc,r2.timestamp desc) as MaxDataChangeTimeStamp,
    sum(case when isnull(occupancy,2) = 0 then isnull(r1.datachange,0) else 0 end) as OffHoursTotal,
    sum(case when isnull(occupancy,2) = 1 then isnull(r1.datachange,0) else 0 end) as OnHoursTotal,
    avg(isnull(r1.Spacetemp,000.0)) as AvgSpaceTemp,
    case when @reportselection = 'D' or min(isnull(r1.Spacetemp,999.9)) = 999.9 then 0.0 else min(isnull(r1.Spacetemp,999.9)) end as MinSpaceTemp,
    case when @reportselection = 'D' then 0.0 else max(isnull(r1.Spacetemp,000.0)) end as MaxSpaceTemp,
    avg(isnull(r1.oat,000.0)) as AvgOat,
    case when @reportselection = 'D' or min(isnull(r1.Oat,999.9)) = 999.9 then 0.0 else min(isnull(r1.Oat,999.9)) end as MinOat,
    case when @reportselection = 'D' then 0.0 else max(isnull(r1.oat,000.0)) end as MaxOat,
    (select isnull(sum(dd),0)
   from (select case when avg(isnull(r2.oat,000.0)) < 15.5
                     then 15.5 - avg(isnull(r2.oat,000.0))
                     else 0.0 end as DD
         from #reportdata r2
         where convert(char(8),r2.timestamp,112) between convert(char(8),psdate,112) and convert(char(8),pedate,112)
         group by convert(char(8),r2.timestamp,112)) dr2) as DegreeDays,
    @edate as end_date_range

FROM #Periods
LEFT OUTER JOIN #ReportData r1 on r1.timestamp between psdate and pedate
cross apply dbo.udf_getrates_3(r1.Timestamp,@util)    
GROUP BY PSname,r1.datestamp,PSdate,pedate
ORDER BY PSdate

END



-- the other thing I think would be worthwhile trying is bypassing having to build the temp report data in the first instance - ie run the query directly from #periods linking to tldata... Will check that out...
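For reference, the #periods loop in the SP above steps through the report range at the chosen granularity and emits one row per period. A rough Python sketch of that loop (function and variable names are invented; the Y branch assumes, as the SP does, that @sdate has been normalised to the first of the month):

```python
# Illustrative sketch of the #periods generation: 15-minute steps for a
# daily report, daily steps for weekly/monthly, monthly steps for yearly,
# with DayParts matching the SP (1/96, 1.0, or days-in-month).
from datetime import datetime, timedelta

def build_periods(sdate, edate, report):
    periods = []
    wdate = sdate
    while wdate <= edate:
        if report in ('M', 'W'):
            day_parts, step = 1.0, timedelta(days=1)
        elif report == 'Y':
            # one row per month; day_parts = days in that month, found by
            # stepping to the first of the next month and measuring the gap
            nxt = (wdate.replace(day=1) + timedelta(days=32)).replace(day=1)
            day_parts, step = float((nxt - wdate.replace(day=1)).days), nxt - wdate
        else:
            day_parts, step = 1.0 / 96.0, timedelta(minutes=15)
        periods.append((wdate, day_parts))
        wdate = wdate + step
    return periods

print(len(build_periods(datetime(2010, 9, 1), datetime(2010, 9, 1, 23, 59), 'D')))  # 96
```

This is only the skeleton; the real loop also builds PEDate and the PSName display string per row.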

 
LVL 1

Author Comment

by:wint100
ID: 33652855
Excellent, will test out. Cheers
 
LVL 1

Author Comment

by:wint100
ID: 33653706
Forgot how long this SP has gotten!

Is it easy enough to add an int (say @isAccumulating)? When 0, we already have datachange; when 1, we run as before (ie, having to calc the datachange in the SP).

If not then no worries, just a thought for transitioning over to new format.
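The idea behind that switch can be sketched in Python (illustrative only; names are invented, not from the SP): pass delta-format data straight through, or derive deltas from an accumulating series with the reset clamp.

```python
# Illustrative sketch of the proposed @isAccumulating switch.
def normalise(readings, is_accumulating):
    if not is_accumulating:
        return list(readings)  # logger already supplies the 15-min changes
    deltas, prev = [], None
    for value in readings:
        deltas.append(max(value - prev, 0.0) if prev is not None else 0.0)
        prev = value
    return deltas

print(normalise([2.5, 2.5, 4.0], is_accumulating=False))    # passthrough
print(normalise([100.0, 102.5, 105.0], is_accumulating=True))
```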

 
LVL 51

Expert Comment

by:Mark Wills
ID: 33653798
Hmmm... yes we could do that.

Will need to keep a log of those tlinstances that have changed, and if reusing the "DATA" column, then the insert from tldata must use the DATA column to populate the datachange column in the #reportdata temp table.

Then the update, instead of commenting out, becomes :

UPDATE #reportdata
SET    DataChange = #reportdata.Data - isnull(prev.data,#reportdata.Data)
FROM   #ReportData
left outer join #reportdata prev on #reportdata.id - 1 = prev.id
where @isaccumulating = 1  -- ie @meter is still using the "old way"

-- Or if you have a table to identify which of the tlinstances have changed, then could use that as a lookup instead of @isaccumulating e.g.

if not exists  (select * from my_log_of_converted_instances where tlinstance = @meter)
   UPDATE #reportdata
   SET    DataChange = #reportdata.Data - isnull(prev.data,#reportdata.Data)
   FROM   #ReportData
   left outer join #reportdata prev on #reportdata.id - 1 = prev.id

-- then you don't have to worry about it, just bung an entry into your log after each conversion
 
LVL 1

Author Comment

by:wint100
ID: 33653859
I was just thinking of putting a selection in the app, so that I can select the type of logging (Accumulating or datachange). This would then pass the @isAccumulating parameter to suit, so the SP knows what to do.
 
LVL 51

Expert Comment

by:Mark Wills
ID: 33656771
Yep, in which case it is the earlier version, or better...

if @isaccumulating = 1
   UPDATE #reportdata
   SET    DataChange = #reportdata.Data - isnull(prev.data,#reportdata.Data)
   FROM   #ReportData
   left outer join #reportdata prev on #reportdata.id - 1 = prev.id
 
LVL 1

Author Comment

by:wint100
ID: 33656820
Cheers, just noticed this comment:

-- the other thing I think would be worthwhile trying is bypassing having to build the temp report data in the first instance - ie run the query directly from #periods linking to tldata... Will check that out...
 
LVL 51

Expert Comment

by:Mark Wills
ID: 33657065
Will depend on the indexes. There are a few inline lookups to contend with, and, if we need to accommodate both ways, then we cannot do the straight query. But once all instances have been converted across then we might be able to check it out...

 
LVL 1

Author Comment

by:wint100
ID: 33657075
Cheers, will it make much difference to speed?
 
LVL 1

Author Comment

by:wint100
ID: 33657150
Also, if the Accumulating requirement is dropped, is this still needed, as we just sum up what's in the TLData.Data column, right?

If @reportselection not in ('Y','M','W') set @intervals = @intervals * 96  -- 96 quarter hours in a day.

Couldn't we just do 'sum(Data) where timestamp between sDate and eDate' or something to that effect.
 
LVL 1

Author Comment

by:wint100
ID: 33657168
Back again, finally got to test the above. It took 13s to run on 10 days, and returned some odd, VERY high values. See attached. I did have to change this to get it working:

INSERT INTO #ReportData (Timestamp, Hundredths, DataChange, Datestamp, CalendarDate, Occupancy, OAT, SpaceTemp)
SELECT Timestamp, Hundredths, convert(real,Data) * @grate*@calc

from


INSERT INTO #ReportData (Timestamp, Hundredths, DataChange, Datestamp, CalendarDate, Occupancy, OAT, SpaceTemp)
SELECT Timestamp, Hundredths, convert(real,Datachange) * @grate*@calc
Book1.xlsx
 
LVL 1

Author Comment

by:wint100
ID: 33657211
Just a note, I was using sDate and eDate for the 'M' report.
 
LVL 51

Expert Comment

by:Mark Wills
ID: 33658443
Yes, I did mention having to change datachange back to data in one of the postings above :)

Timing shouldn't be any different from before really... Will check it out...

As for doing the straight link (after all the migration has been done to get the "change" in the "data" column) then yes, it is very tempting to do the straight query. But if the indexes do not support how we need to extract the data, then you are accessing a table with tens of millions of rows.
 
LVL 1

Author Comment

by:wint100
ID: 33659487
Missed that..

Even with the change to data, I get the results in the attached spreadsheet. It's returning huge values and taking a long time to complete.
 
LVL 1

Author Comment

by:wint100
ID: 33691237
Just re-visited this and found I was running against a TLInstance where each row had a value of 35000000, so that's why I had high values, but it really slowed things down.

With correct data, it completed in under 1s.

Cheers
 
LVL 51

Expert Comment

by:Mark Wills
ID: 33691779
Also be aware that while the "migration" is on from "raw meter readings" to incremental changes between readings, the DATA field must get updated, not just that temporary column we had been playing with.

Also noted that we are inserting "data" into the "datachange" column in #reportdata, but the update is trying to use "data", which is not being populated... So that update really needs to be:


if @isaccumulating = 1
   UPDATE #reportdata
   SET    DataChange = #reportdata.Datachange - isnull(prev.datachange,#reportdata.Datachange)
   FROM   #ReportData
   left outer join #reportdata prev on #reportdata.id - 1 = prev.id

Does that make any sense ?
 
LVL 1

Author Comment

by:wint100
ID: 33699770
Yes, perfect sense.

Thanks
