I'm working on an ASP.NET website that connects to SQL Server 2005. I'm using LINQ to SQL for data access.
I have the following 3 tables (I've also attached an image of what they look like in a Database Diagram).
The Facility table represents the different plants where employees put in shifts. The WorkShift table records each shift's start and end dates, total hours worked, and the facility it was worked at. Finally, there's a PayPeriod table (pay periods do NOT fall neatly at regular intervals).
My end goal is to display the following (in a gridview) for a specific employee (e.g. FkEmployeeId = 1):
FkFacilityId | StartDate | EndDate | ReqHrs | HrsWorked | Over/Under | Req. Met?
1 | 1/1/2010 | 1/27/2010 | 60 | 45 | -15 | NO
1 | 1/28/2010 | 2/20/2010 | 20 | 25 | 5 | YES
2 | 4/1/2010 | 4/22/2010 | 30 | 30 | 0 | YES
The first 4 columns are from the PayPeriod table. That part's easy.
The part I'm not sure about is calculating the HrsWorked field, which needs to be a SUM over the WorkShift records that match the pay period and facility for that employee ID. I'm thinking I should use a LINQ Sum clause, but I'm unsure how to do so from the "parent" table's query.
Then, for each row, I'll need to calculate the Over/Under field by subtracting ReqHrs from HrsWorked, and display YES or NO in the "Req. Met?" field based on whether Over/Under is >= 0.
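Here's the kind of query shape I have in mind — purely a sketch, since I'm guessing at names: it assumes a DataContext (`db`) exposing `PayPeriods` and `WorkShifts`, a `WorkShift.HoursWorked` column, and that a shift belongs to a pay period when its StartDate falls inside that period's date range. None of those names are confirmed by my actual schema.

```csharp
// Sketch only: db, PayPeriods, WorkShifts, HoursWorked, and the
// date-matching rule are all assumptions, not the real schema.
int employeeId = 1;

var rows =
    from pp in db.PayPeriods
    // Correlated subquery: total hours this employee worked at this
    // facility during this pay period. The nullable cast makes SQL's
    // SUM return null (not throw) when no shifts match; ?? maps it to 0.
    let hrsWorked = db.WorkShifts
        .Where(ws => ws.FkEmployeeId == employeeId
                  && ws.FkFacilityId == pp.FkFacilityId
                  && ws.StartDate >= pp.StartDate
                  && ws.StartDate <= pp.EndDate)
        .Sum(ws => (decimal?)ws.HoursWorked) ?? 0
    select new
    {
        pp.FkFacilityId,
        pp.StartDate,
        pp.EndDate,
        pp.ReqHrs,
        HrsWorked = hrsWorked,
        OverUnder = hrsWorked - pp.ReqHrs,          // negative = under
        ReqMet    = hrsWorked >= pp.ReqHrs ? "YES" : "NO"
    };

// Anonymous-type projections bind straight to a GridView.
GridView1.DataSource = rows;
GridView1.DataBind();
```

The `let` clause is what lets the "child" SUM live inside the "parent" PayPeriod query, so Over/Under and Req. Met? can reuse it without repeating the subquery.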
Is there a relatively clean way to do this in LINQ? I'm tempted to brute-force it in my code-behind, but I'm hesitant because I assume LINQ (or T-SQL?) would be both more efficient and cleaner (keeping data access separate from presentation code).