MS SQL report sum

see attached.

The user will give me a selected month, like 11/2015, and I am able to get the total transaction amount grouped by date.
How can I calculate the average, total, and business days? Assume weekends and holidays have a total transaction count of zero.
2.xls
ITsolutionWizard Asked:

Brian Crowe, Database Administrator, commented:
I am a little unclear on what you are asking for. Are you looking for a SQL query or help with SSRS? If the former, then we need schema detail.
ITsolutionWizard (Author) commented:
I am looking for a SQL query.
ITsolutionWizard (Author) commented:
just like

datecreated 11/01/2015
totaltrancount 800

that's all.

and I need to get average and total
PortletPaul, EE Topic Advisor, commented:
If a question has only a small amount of information and/or the information is vague, then we can only provide general advice and cannot be specific.

For example I could not propose a thorough query solution because:
>>I don't know the table name(s) or column names.
>>There is no sample data to populate tables.

So, all we have is what I presume is the "expected result", where column F is a "rolling average" and column G is a "running sum" of the count in column E. It appears that "business days" excludes Saturday and Sunday.

So, generating some test data:
CREATE TABLE Table1
    ([SomeColumn] datetime)
;
    
INSERT INTO Table1
    ([SomeColumn])
VALUES
    ('2015-11-01 00:00:00'),
    ('2015-11-01 00:00:00'),
    ('2015-11-01 00:00:00'),
    ('2015-11-01 00:00:00'),
    ('2015-11-01 00:00:00'),
    ('2015-11-01 00:00:00'),
    ('2015-11-02 00:00:00'),
    ('2015-11-02 00:00:00'),
    ('2015-11-02 00:00:00'),
    ('2015-11-02 00:00:00'),
    ('2015-11-02 00:00:00'),
    ('2015-11-02 00:00:00'),
    ('2015-11-02 00:00:00'),
    ('2015-11-02 00:00:00'),
    ('2015-11-02 00:00:00'),
    ('2015-11-03 00:00:00'),
    ('2015-11-03 00:00:00'),
    ('2015-11-03 00:00:00'),
    ('2015-11-07 00:00:00')
;


Using this "generic" query:
select
      d.*
      , sum(some_avg_calc) over(order by dateColumn) as x
      , sum(count_all) over(order by dateColumn) as y
      , sum(count_bus) over(order by dateColumn) as z
from (
      select
              cast(SomeColumn as date) as DateColumn
            , avg(ABS(CHECKSUM(NewId())) % 14)  as some_avg_calc
            , count(*) count_all
            , count(case when datediff(day,0,SomeColumn) % 7 < 5 then 1 end) count_bus
      from table1
      group by
            cast(SomeColumn as date)
     ) as d
order by
      DateColumn


I can produce this result:
| DateColumn | some_avg_calc | count_all | count_bus |  x |  y |  z |
|------------|---------------|-----------|-----------|----|----|----|
| 2015-11-01 |             4 |         6 |         0 |  4 |  6 |  0 |
| 2015-11-02 |             4 |         9 |         9 |  8 | 15 |  9 |
| 2015-11-03 |             8 |         3 |         3 | 16 | 18 | 12 |
| 2015-11-07 |            12 |         1 |         0 | 28 | 19 | 12 |
        


Note:
datediff(day,0,SomeColumn) % 7 < 5 identifies "business days": a remainder of 0 to 4 is Monday to Friday (5 is Saturday, 6 is Sunday). This approach works regardless of server settings.
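As an aside (my sanity check, not part of the original thread): DATEDIFF(day, 0, ...) counts days from SQL Server's zero date, 1900-01-01, which was a Monday, so the remainder modulo 7 maps 0-4 to Monday-Friday. The same arithmetic can be sketched in Python:

```python
from datetime import date

# SQL Server's "day 0" for DATEDIFF(day, 0, x) is 1900-01-01, a Monday.
EPOCH = date(1900, 1, 1)

def is_business_day(d: date) -> bool:
    # Mirrors the T-SQL predicate: datediff(day, 0, SomeColumn) % 7 < 5
    # Remainders 0-4 are Mon-Fri; 5 is Saturday, 6 is Sunday.
    return (d - EPOCH).days % 7 < 5

print(is_business_day(date(2015, 11, 2)))  # Monday -> True
print(is_business_day(date(2015, 11, 7)))  # Saturday -> False
```

Because the arithmetic never consults the server's DATEFIRST or language settings, it behaves the same everywhere, which is the point of the trick.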
ITsolutionWizard (Author) commented:
I have the database table design attached with this message,
and the average and total are what I hope to generate.
Actually, I can add two more columns called average and total if that is easier for you.

Thanks!
PortletPaul, EE Topic Advisor, commented:
Is that image your db design?
Is it a table? Or a query result?

Do you intend for us to transcribe that data?

When you do the running sum where does it start or restart?
ITsolutionWizard (Author) commented:
It is a table.
Brian Crowe, Database Administrator, commented:
This is what we need you to provide to facilitate answering your question:

The idea here is to lower the barrier to entry for experts who would like to help but have jobs of their own and don't want to exchange 15 comments with the original poster to obtain information that should have been provided.

1. Schema information
2. Sample data, preferably scripted
3. Sample output, preferably based on the sample input (but not necessarily)
4. An explanation of the logic applied to transition from one to the other, if it isn't intuitive from looking at the output

The first two of these can be obtained rather simply utilizing functionality built into SSMS.

http://blog.sqlauthority.com/2012/07/18/sql-server-generate-script-for-schema-and-data-sql-in-sixty-seconds-021-video/
ITsolutionWizard (Author) commented:
Brian,

First of all, thank you for your time helping me, but I think my question is very straightforward.
I have one table with two columns, and the data is listed as well.

I just hope to generate two more columns: average and total.

That's all I need. And the sample is also listed with this question.
PortletPaul, EE Topic Advisor, commented:
You have IMAGES of data, but not reusable data.
The "expected result" appears in the supplied spreadsheet, and from that I could reverse-engineer the "data".
It was not until I saw the image of the data that I knew that was true, however, and I did not know the column names until then either.
I still don't know the table name.

OK, so I can go the extra step; here is some reusable data:
ID	Date	recordingCount
1	2015-11-01 00:00:00	0
2	2015-11-02 00:00:00	800
3	2015-11-03 00:00:00	1000
4	2015-11-04 00:00:00	600
5	2015-11-05 00:00:00	1200
6	2015-11-06 00:00:00	500
7	2015-11-07 00:00:00	0
8	2015-11-08 00:00:00	0
9	2015-11-09 00:00:00	1000
10	2015-11-10 00:00:00	600
11	2015-11-11 00:00:00	750
12	2015-11-12 00:00:00	850
13	2015-11-13 00:00:00	1300
14	2015-11-14 00:00:00	0
15	2015-11-15 00:00:00	0
16	2015-11-16 00:00:00	2000
17	2015-11-17 00:00:00	1500
18	2015-11-18 00:00:00	700
19	2015-11-19 00:00:00	600
20	2015-11-20 00:00:00	1000
21	2015-11-21 00:00:00	0
22	2015-11-22 00:00:00	0
23	2015-11-23 00:00:00	400
24	2015-11-24 00:00:00	950
25	2015-11-25 00:00:00	1000
26	2015-11-26 00:00:00	0
27	2015-11-27 00:00:00	0
28	2015-11-28 00:00:00	0
29	2015-11-29 00:00:00	0
30	2015-11-30 00:00:00	1000


Of course, I still need to actually get that into a table, so now I have to form inserts. Well, OK, I can do that too, I guess. Here you go. I have had to choose a table name to do this, of course.
CREATE TABLE Table1
    ([ID] int, [Date] datetime, [recordingCount] int)
;
    
INSERT INTO Table1
    ([ID], [Date], [recordingCount])
VALUES
    (1, '2015-11-01 00:00:00', 0),
    (2, '2015-11-02 00:00:00', 800),
    (3, '2015-11-03 00:00:00', 1000),
    (4, '2015-11-04 00:00:00', 600),
    (5, '2015-11-05 00:00:00', 1200),
    (6, '2015-11-06 00:00:00', 500),
    (7, '2015-11-07 00:00:00', 0),
    (8, '2015-11-08 00:00:00', 0),
    (9, '2015-11-09 00:00:00', 1000),
    (10, '2015-11-10 00:00:00', 600),
    (11, '2015-11-11 00:00:00', 750),
    (12, '2015-11-12 00:00:00', 850),
    (13, '2015-11-13 00:00:00', 1300),
    (14, '2015-11-14 00:00:00', 0),
    (15, '2015-11-15 00:00:00', 0),
    (16, '2015-11-16 00:00:00', 2000),
    (17, '2015-11-17 00:00:00', 1500),
    (18, '2015-11-18 00:00:00', 700),
    (19, '2015-11-19 00:00:00', 600),
    (20, '2015-11-20 00:00:00', 1000),
    (21, '2015-11-21 00:00:00', 0),
    (22, '2015-11-22 00:00:00', 0),
    (23, '2015-11-23 00:00:00', 400),
    (24, '2015-11-24 00:00:00', 950),
    (25, '2015-11-25 00:00:00', 1000),
    (26, '2015-11-26 00:00:00', 0),
    (27, '2015-11-27 00:00:00', 0),
    (28, '2015-11-28 00:00:00', 0),
    (29, '2015-11-29 00:00:00', 0),
    (30, '2015-11-30 00:00:00', 1000)
;


Alrighty. So now the "very straightforward" extra columns. Well, if I look back at the expected result, I see that the "total" is actually a "running sum", and there is a function for that: SUM(...) OVER(ORDER BY ...)

select
    *
    , SUM(recordingCount) OVER(ORDER BY ID) as Total
from table1


| ID |                       Date | recordingCount | Total |
|----|----------------------------|----------------|-------|
|  1 | November, 01 2015 00:00:00 |              0 |     0 |
|  2 | November, 02 2015 00:00:00 |            800 |   800 |
|  3 | November, 03 2015 00:00:00 |           1000 |  1800 |
|  4 | November, 04 2015 00:00:00 |            600 |  2400 |
|  5 | November, 05 2015 00:00:00 |           1200 |  3600 |
|  6 | November, 06 2015 00:00:00 |            500 |  4100 |
|  7 | November, 07 2015 00:00:00 |              0 |  4100 |
|  8 | November, 08 2015 00:00:00 |              0 |  4100 |
|  9 | November, 09 2015 00:00:00 |           1000 |  5100 |
| 10 | November, 10 2015 00:00:00 |            600 |  5700 |
| 11 | November, 11 2015 00:00:00 |            750 |  6450 |
| 12 | November, 12 2015 00:00:00 |            850 |  7300 |
| 13 | November, 13 2015 00:00:00 |           1300 |  8600 |
| 14 | November, 14 2015 00:00:00 |              0 |  8600 |
| 15 | November, 15 2015 00:00:00 |              0 |  8600 |
| 16 | November, 16 2015 00:00:00 |           2000 | 10600 |
| 17 | November, 17 2015 00:00:00 |           1500 | 12100 |
| 18 | November, 18 2015 00:00:00 |            700 | 12800 |
| 19 | November, 19 2015 00:00:00 |            600 | 13400 |
| 20 | November, 20 2015 00:00:00 |           1000 | 14400 |
| 21 | November, 21 2015 00:00:00 |              0 | 14400 |
| 22 | November, 22 2015 00:00:00 |              0 | 14400 |
| 23 | November, 23 2015 00:00:00 |            400 | 14800 |
| 24 | November, 24 2015 00:00:00 |            950 | 15750 |
| 25 | November, 25 2015 00:00:00 |           1000 | 16750 |
| 26 | November, 26 2015 00:00:00 |              0 | 16750 |
| 27 | November, 27 2015 00:00:00 |              0 | 16750 |
| 28 | November, 28 2015 00:00:00 |              0 | 16750 |
| 29 | November, 29 2015 00:00:00 |              0 | 16750 |
| 30 | November, 30 2015 00:00:00 |           1000 | 17750 |
        

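Outside SQL, the running Total column can be cross-checked with a few lines of Python (my sketch, not part of the thread); itertools.accumulate plays the role of SUM(recordingCount) OVER (ORDER BY ID):

```python
from itertools import accumulate

# Daily counts for IDs 1-7 from the sample data above.
counts = [0, 800, 1000, 600, 1200, 500, 0]

# Equivalent of: SUM(recordingCount) OVER (ORDER BY ID)
running_total = list(accumulate(counts))
print(running_total)  # [0, 800, 1800, 2400, 3600, 4100, 4100]
```

The last value, 4100, matches the Total column for ID 7 in the result above: zero-count days carry the previous total forward.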

Great... oh, hold on, that doesn't look like the image of the wanted result... Perhaps there is some undocumented condition to be met. If I scan the data, it looks like they may want to suppress that running sum whenever the day's own count is zero. Looks reasonable, so I'll assume this is an unstated requirement. Here goes.
select
    *
    , case when recordingCount > 0 then SUM(recordingCount) OVER(ORDER BY ID)
        else 0
      end as Total
from table1


| ID |                       Date | recordingCount | Total |
|----|----------------------------|----------------|-------|
|  1 | November, 01 2015 00:00:00 |              0 |     0 |
|  2 | November, 02 2015 00:00:00 |            800 |   800 |
|  3 | November, 03 2015 00:00:00 |           1000 |  1800 |
|  4 | November, 04 2015 00:00:00 |            600 |  2400 |
|  5 | November, 05 2015 00:00:00 |           1200 |  3600 |
|  6 | November, 06 2015 00:00:00 |            500 |  4100 |
|  7 | November, 07 2015 00:00:00 |              0 |     0 |
|  8 | November, 08 2015 00:00:00 |              0 |     0 |
|  9 | November, 09 2015 00:00:00 |           1000 |  5100 |
| 10 | November, 10 2015 00:00:00 |            600 |  5700 |
| 11 | November, 11 2015 00:00:00 |            750 |  6450 |
| 12 | November, 12 2015 00:00:00 |            850 |  7300 |
| 13 | November, 13 2015 00:00:00 |           1300 |  8600 |
| 14 | November, 14 2015 00:00:00 |              0 |     0 |
| 15 | November, 15 2015 00:00:00 |              0 |     0 |
| 16 | November, 16 2015 00:00:00 |           2000 | 10600 |
| 17 | November, 17 2015 00:00:00 |           1500 | 12100 |
| 18 | November, 18 2015 00:00:00 |            700 | 12800 |
| 19 | November, 19 2015 00:00:00 |            600 | 13400 |
| 20 | November, 20 2015 00:00:00 |           1000 | 14400 |
| 21 | November, 21 2015 00:00:00 |              0 |     0 |
| 22 | November, 22 2015 00:00:00 |              0 |     0 |
| 23 | November, 23 2015 00:00:00 |            400 | 14800 |
| 24 | November, 24 2015 00:00:00 |            950 | 15750 |
| 25 | November, 25 2015 00:00:00 |           1000 | 16750 |
| 26 | November, 26 2015 00:00:00 |              0 |     0 |
| 27 | November, 27 2015 00:00:00 |              0 |     0 |
| 28 | November, 28 2015 00:00:00 |              0 |     0 |
| 29 | November, 29 2015 00:00:00 |              0 |     0 |
| 30 | November, 30 2015 00:00:00 |           1000 | 17750 |

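The zero-suppression rule (inferred from the image, so an assumption on my part) is easy to state procedurally: carry the running sum across all rows, but display 0 on any day whose own count is zero. A quick Python sketch of the same logic:

```python
from itertools import accumulate

counts = [0, 800, 1000, 600, 1200, 500, 0, 0, 1000]  # IDs 1-9

# Running sum over every row, but shown as 0 when the day's count is zero,
# matching the CASE WHEN recordingCount > 0 ... ELSE 0 END above.
totals = [t if c > 0 else 0 for c, t in zip(counts, accumulate(counts))]
print(totals)  # [0, 800, 1800, 2400, 3600, 4100, 0, 0, 5100]
```

Note the sum itself is never reset; only the displayed value is masked, which is why ID 9 jumps straight to 5100.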

Mmmm, looking good. So what about the other "straightforward" column, the "average"? Well, there is an AVG function; let's see what it does when also combined with OVER().
select
    *
    , case when recordingCount > 0 then AVG(recordingCount) OVER(ORDER BY ID)
        else 0
      end as Average
    , case when recordingCount > 0 then SUM(recordingCount) OVER(ORDER BY ID)
        else 0
      end as Total
from table1


| ID |                       Date | recordingCount | Average | Total |
|----|----------------------------|----------------|---------|-------|
|  1 | November, 01 2015 00:00:00 |              0 |       0 |     0 |
|  2 | November, 02 2015 00:00:00 |            800 |     400 |   800 |
|  3 | November, 03 2015 00:00:00 |           1000 |     600 |  1800 |
|  4 | November, 04 2015 00:00:00 |            600 |     600 |  2400 |
|  5 | November, 05 2015 00:00:00 |           1200 |     720 |  3600 |
|  6 | November, 06 2015 00:00:00 |            500 |     683 |  4100 |
|  7 | November, 07 2015 00:00:00 |              0 |       0 |     0 |
|  8 | November, 08 2015 00:00:00 |              0 |       0 |     0 |
|  9 | November, 09 2015 00:00:00 |           1000 |     566 |  5100 |
| 10 | November, 10 2015 00:00:00 |            600 |     570 |  5700 |
| 11 | November, 11 2015 00:00:00 |            750 |     586 |  6450 |
| 12 | November, 12 2015 00:00:00 |            850 |     608 |  7300 |
| 13 | November, 13 2015 00:00:00 |           1300 |     661 |  8600 |
| 14 | November, 14 2015 00:00:00 |              0 |       0 |     0 |
| 15 | November, 15 2015 00:00:00 |              0 |       0 |     0 |
| 16 | November, 16 2015 00:00:00 |           2000 |     662 | 10600 |
| 17 | November, 17 2015 00:00:00 |           1500 |     711 | 12100 |
| 18 | November, 18 2015 00:00:00 |            700 |     711 | 12800 |
| 19 | November, 19 2015 00:00:00 |            600 |     705 | 13400 |
| 20 | November, 20 2015 00:00:00 |           1000 |     720 | 14400 |
| 21 | November, 21 2015 00:00:00 |              0 |       0 |     0 |
| 22 | November, 22 2015 00:00:00 |              0 |       0 |     0 |
| 23 | November, 23 2015 00:00:00 |            400 |     643 | 14800 |
| 24 | November, 24 2015 00:00:00 |            950 |     656 | 15750 |
| 25 | November, 25 2015 00:00:00 |           1000 |     670 | 16750 |
| 26 | November, 26 2015 00:00:00 |              0 |       0 |     0 |
| 27 | November, 27 2015 00:00:00 |              0 |       0 |     0 |
| 28 | November, 28 2015 00:00:00 |              0 |       0 |     0 |
| 29 | November, 29 2015 00:00:00 |              0 |       0 |     0 |
| 30 | November, 30 2015 00:00:00 |           1000 |     591 | 17750 |
        


Blimey! Strewth! That average isn't what is expected. How on earth is it calculated?

The story continues, with the requestor recognizing that we are not clairvoyants and stepping in with some further assistance.
PortletPaul, EE Topic Advisor, commented:
Well, as I was here and giving it some thought:
select
    *
    , cast(SUM(recordingCount) OVER(ORDER BY ID) as decimal(12,3))
      / sum(case when recordingCount > 0 then 1.0 end) over(order by id) 
      as Average
    , case when recordingCount > 0 then SUM(recordingCount) OVER(ORDER BY ID)
        else 0
      end as Total
from table1


| ID |                       Date | recordingCount |            Average | Total |
|----|----------------------------|----------------|--------------------|-------|
|  1 | November, 01 2015 00:00:00 |              0 |             (null) |     0 |
|  2 | November, 02 2015 00:00:00 |            800 |                800 |   800 |
|  3 | November, 03 2015 00:00:00 |           1000 |                900 |  1800 |
|  4 | November, 04 2015 00:00:00 |            600 |                800 |  2400 |
|  5 | November, 05 2015 00:00:00 |           1200 |                900 |  3600 |
|  6 | November, 06 2015 00:00:00 |            500 |                820 |  4100 |
|  7 | November, 07 2015 00:00:00 |              0 |                820 |     0 |
|  8 | November, 08 2015 00:00:00 |              0 |                820 |     0 |
|  9 | November, 09 2015 00:00:00 |           1000 |                850 |  5100 |
| 10 | November, 10 2015 00:00:00 |            600 |  814.2857142857143 |  5700 |
| 11 | November, 11 2015 00:00:00 |            750 |             806.25 |  6450 |
| 12 | November, 12 2015 00:00:00 |            850 |  811.1111111111111 |  7300 |
| 13 | November, 13 2015 00:00:00 |           1300 |                860 |  8600 |
| 14 | November, 14 2015 00:00:00 |              0 |                860 |     0 |
| 15 | November, 15 2015 00:00:00 |              0 |                860 |     0 |
| 16 | November, 16 2015 00:00:00 |           2000 |  963.6363636363636 | 10600 |
| 17 | November, 17 2015 00:00:00 |           1500 | 1008.3333333333334 | 12100 |
| 18 | November, 18 2015 00:00:00 |            700 |  984.6153846153846 | 12800 |
| 19 | November, 19 2015 00:00:00 |            600 |  957.1428571428571 | 13400 |
| 20 | November, 20 2015 00:00:00 |           1000 |                960 | 14400 |
| 21 | November, 21 2015 00:00:00 |              0 |                960 |     0 |
| 22 | November, 22 2015 00:00:00 |              0 |                960 |     0 |
| 23 | November, 23 2015 00:00:00 |            400 |                925 | 14800 |
| 24 | November, 24 2015 00:00:00 |            950 |  926.4705882352941 | 15750 |
| 25 | November, 25 2015 00:00:00 |           1000 |  930.5555555555555 | 16750 |
| 26 | November, 26 2015 00:00:00 |              0 |  930.5555555555555 |     0 |
| 27 | November, 27 2015 00:00:00 |              0 |  930.5555555555555 |     0 |
| 28 | November, 28 2015 00:00:00 |              0 |  930.5555555555555 |     0 |
| 29 | November, 29 2015 00:00:00 |              0 |  930.5555555555555 |     0 |
| 30 | November, 30 2015 00:00:00 |           1000 |  934.2105263157895 | 17750 |

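In other words, the Average column here is the running sum divided by the running count of non-zero days, not a plain AVG over all rows. A Python rendering of the same arithmetic (my sketch, for checking the numbers):

```python
from itertools import accumulate

counts = [0, 800, 1000, 600, 1200, 500, 0, 0, 1000]  # IDs 1-9

run_sum = list(accumulate(counts))
# Like sum(case when recordingCount > 0 then 1.0 end) over (order by id):
# count only the days that actually had transactions.
run_days = list(accumulate(1 if c > 0 else 0 for c in counts))

# NULL (None) until the first non-zero day, then running mean per active day.
average = [s / d if d else None for s, d in zip(run_sum, run_days)]
print(average)  # [None, 800.0, 900.0, 800.0, 900.0, 820.0, 820.0, 820.0, 850.0]
```

These values line up with the Average column for IDs 1-9 above, including the (null) on the first row and the 820 carried over the weekend of IDs 7-8.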

I wonder if this is really "straightforward"?

http://sqlfiddle.com/#!6/6144f/6

ITsolutionWizard (Author) commented:
No matter what... you have the points for sure. I really appreciate your help. You are the man.
PortletPaul, EE Topic Advisor, commented:
>>"no matter what..."
My "story" had an intention. Tossing minimal information into a question isn't "playing the game" well, and it causes delays. We are volunteers, most of whom cannot afford the time to reverse-engineer (or invent) test data; if you spend a little more time on question preparation, you will save a lot of overall time because we don't need to ask so many questions.

There is a time-honoured and well-proven method to get similar questions answered quickly and accurately; provide these:

a. "Sample data" (with schema if possible).
The data should be "reusable" (nothing is more frustrating than images of data). As Brian identified (ID: 41263644), you can even script this out.
b. "Expected results".
This can be an image or spreadsheet tab, but it can simply be text as long as it can be read easily.

Important "rules" about any calculations should be written too.

TIPS:
1. Supply the table name(s) as well as column names. It's amazing how many people miss doing this, and we end up having to invent table names like "table1", "yourtable", etc. Weird, as it is so much easier for you if we use the correct table name(s).

2. For images, it is better to "embed" than to "attach". In the toolbar above the comment edit area, second from last on the right, there is an image icon. Using this "embeds" images so they can be seen without having to open another browser tab. (By the way, I did this for you in the images above, so you can now see the impact this has.)

3. If using a spreadsheet, use multiple tabs in preference to attaching multiple spreadsheets, e.g. one tab per table of sample data plus one for the expected result.

{+edit}
And "sample" (as used in "sample data") also indicates small; i.e. not a lot of data is required.
Microsoft SQL Server