Databases

Databases are organized collections of data, most commonly accessed through management systems comprising the schemas, tables, queries, and processes that allow users to enter and manipulate information or put it to other uses, such as in web applications or for reporting.

Hi All

(This isn't a question so much as a member poll, and it's more of a process question than a code question, so points will be split across any meaningful answers.)

Does anyone have a process (not necessarily code) for auditing a data warehouse? Specifically, a process or code that does something like this:
  • Compare total sums/counts in the source data to sums/counts in the target DW (see the sketch after this list)?
  • Compare incremental-load (day, hour, etc.) sums/counts in the source data to sums/counts in the target DW?
  • Compare blank/NULL values, by table and column, in the source data to the same in the DW?
  • (Kinda dreaming here) Use DMVs to show metrics of how much the DW tables are used?
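
For the first bullet, here is a minimal T-SQL sketch of the idea; the linked server (SRC) and all object names are placeholders, not anything from an actual system:

-- Hypothetical sketch: compare row counts and a control total between a
-- source table (reached via a linked server named SRC) and its DW
-- counterpart. All object names are placeholders.
DECLARE @src_rows bigint, @dw_rows bigint,
        @src_amt  decimal(18,2), @dw_amt decimal(18,2);

SELECT @src_rows = COUNT(*), @src_amt = SUM(order_amount)
FROM   SRC.SourceDb.dbo.Orders;

SELECT @dw_rows = COUNT(*), @dw_amt = SUM(order_amount)
FROM   dbo.FactOrders;

SELECT 'Orders'             AS table_name,
       @src_rows            AS source_rows,
       @dw_rows             AS dw_rows,
       @src_rows - @dw_rows AS row_variance,
       @src_amt  - @dw_amt  AS amount_variance;

The same pattern with a WHERE clause on a load-date column would cover the incremental comparison in the second bullet.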

I've inherited a DW (really a data mart, but that's another story) that users have suspected for a while now. I need to nail down a process to audit this DW to verify that the data accurately reflects the source data, and to communicate that in a simple way with a few more details than executive red/yellow/green. I also have a selfish interest in being able to communicate my progress in resolving these issues clearly.

I tried Googling it myself, and all the results were either SQL Server auditing or consulting companies advertising DW projects.

Thanks in advance.
Jim

One of my Exchange 2010 databases, say one named "Sales", is 160 GB. I moved all mailboxes out of "Sales" into the "Sales2" database and ran eseutil /d sales.edb /p on it. I thought this would significantly reduce the "Sales" database size; however, the size didn't change much, going only from 160 GB down to 145 GB.

Can ESEUTIL dramatically reduce database size?
I am developing an application that needs to connect to an Oracle database, and I've added the required NuGet packages to connect. If I have DBA credentials for a schema, I can connect fine and expand the tables in the schema to add them to my model. However, on another schema I only have permission to SELECT from a table, so when I go through the wizard to set up my EF model, I can get to the page for selecting the tables to add to my model, but I can't get the Tables node to expand in order to select the table I have SELECT permission on. The first screenshot shows the screen I am stuck on; I cannot click the checkbox or expand the node to show the tables. The second screenshot shows the same behavior in the VS2015 Server Explorer: the node under 'Tables' can't be expanded to show the table I have SELECT access to.

As I said before, there are other schemas where I have DBA privileges, and those work fine (I can expand all the nodes). Surely we are not restricted to users with DBA privileges when using EF?
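
One hedged check: the EF wizard can only list objects the connected account sees in Oracle's data dictionary, so it is worth confirming the table is visible there at all (the schema name below is a placeholder):

-- Run as the restricted account: if the table does not appear here,
-- the EF wizard and Server Explorer will not be able to list it either.
SELECT owner, table_name
FROM   all_tables
WHERE  owner = 'OTHER_SCHEMA';   -- placeholder schema name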

Any help, much appreciated
Thanks
[Attached screenshots: C--Users-Thampi_m-Desktop-efo1.PNG (the EF wizard screen), C--Users-Thampi_m-Desktop-efo2.PNG (VS2015 Server Explorer)]
Hi EE,

I would like to transfer all data from a production instance DB to a UAT instance DB. However, I would like to omit the tables whose schemas differ between the two DBs. Is there a way to do this via the GUI? If not, what command would compare the schemas before the transfer?

I am using SQL Server 2012.
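
One hedged approach to the comparison step, assuming both databases are reachable from one server (the database names below are placeholders): a FULL OUTER JOIN over INFORMATION_SCHEMA.COLUMNS lists every table whose column definitions exist on one side but not the other.

-- Tables returned here differ between the two databases and are candidates
-- to omit from the transfer. ProdDb and UatDb are placeholder names.
SELECT DISTINCT COALESCE(p.TABLE_NAME, u.TABLE_NAME) AS table_name
FROM   ProdDb.INFORMATION_SCHEMA.COLUMNS AS p
FULL OUTER JOIN UatDb.INFORMATION_SCHEMA.COLUMNS AS u
       ON  p.TABLE_SCHEMA = u.TABLE_SCHEMA
       AND p.TABLE_NAME   = u.TABLE_NAME
       AND p.COLUMN_NAME  = u.COLUMN_NAME
       AND p.DATA_TYPE    = u.DATA_TYPE
WHERE  p.COLUMN_NAME IS NULL OR u.COLUMN_NAME IS NULL;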

Any assistance is appreciated.

Thank you.
Hello,

I want to load a CSV file into the database. There is a column in the CSV file called merchant_id that corresponds to a column in the SQL database. I want to update the row in the database if the merchant ID already exists there, and add the row from the CSV file if it doesn't.


Here,

"rs" contains all the columns of the CSV file and their values.

"rsAdd" contains all the columns of the database table (Merchants) and their values.

' Load the CSV into rs, then upsert each row into Merchants: update the
' matching row when the merchant id already exists, otherwise add it.
Set rs = GetRecordSet("C:\upload\new\" & TheNewFileName2)

Set cn = Server.CreateObject("ADODB.Connection")
cn.ConnectionString = "DSN=REWC"
cn.Open

i = 0
While Not rs.EOF
    ' Look up this CSV row's merchant. CSV columns use underscores
    ' (merchant_id); the table columns do not (merchantid).
    TheSql = "SELECT * FROM Merchants WHERE merchantid = '" & _
             Replace(rs("merchant_id").Value, "'", "''") & "'"
    Set rsAdd = Server.CreateObject("ADODB.RecordSet")
    rsAdd.CursorType = 1    ' adOpenKeyset
    rsAdd.LockType = 3      ' adLockOptimistic
    rsAdd.Open TheSql, cn

    If rsAdd.EOF Then rsAdd.AddNew    ' no match: insert; else update in place

    For Each col In rs.Fields
        rsAdd(Replace(col.Name, "_", "")) = col.Value
    Next
    rsAdd.Update
    rsAdd.Close
    Set rsAdd = Nothing

    rs.MoveNext
    i = i + 1
    If (i Mod 100) = 0 Or i = 1 Then    ' progress output every 100 rows
        Response.Write i & ","
        If (i Mod 2500) = 0 Then Response.Write "<br />"
    End If
Wend


It's basically a complaint form; if the complaint is solved, the row should be removed automatically from the DataGridView.
I just installed ElasticSearch, MongoDB, and GrayLog2, but I'm getting an error and am unable to access the web interface. Please suggest a solution.

/etc/elasticsearch
# ls -lrtha
drwxr-x---.  2 root elasticsearch    6 Apr 24 16:29 scripts
-rwxr-x---.  1 root elasticsearch 2.6K Apr 24 16:04 logging.yml
-rwxr-x---.  1 root elasticsearch 3.2K Jul 20 05:05 elasticsearch.yml


/var/log/graylog-server/server.log

2017-07-20T07:50:36.694Z INFO  [CmdLineTool] Loaded plugin: Elastic Beats Input 2.2.3 [org.graylog.plugins.beats.BeatsInputPlugin]
2017-07-20T07:50:36.696Z INFO  [CmdLineTool] Loaded plugin: Collector 2.2.3 [org.graylog.plugins.collector.CollectorPlugin]
2017-07-20T07:50:36.699Z INFO  [CmdLineTool] Loaded plugin: Enterprise Integration Plugin 2.2.3 [org.graylog.plugins.enterprise_integration.EnterpriseIntegrationPlugin]
2017-07-20T07:50:36.700Z INFO  [CmdLineTool] Loaded plugin: MapWidgetPlugin 2.2.3 [org.graylog.plugins.map.MapWidgetPlugin]
2017-07-20T07:50:36.708Z INFO  [CmdLineTool] Loaded plugin: Pipeline Processor Plugin 2.2.3 [org.graylog.plugins.pipelineprocessor.ProcessorPlugin]
2017-07-20T07:50:36.709Z INFO  [CmdLineTool] Loaded plugin: Anonymous Usage Statistics 2.2.3 [org.graylog.plugins.usagestatistics.UsageStatsPlugin]
2017-07-20T07:50:36.813Z ERROR [CmdLineTool] Invalid configuration
com.github.joschi.jadconfig.ValidationException: Cannot read file elasticsearch_config_file at path /etc/elasticsearch/elasticsearch.yml. Please specify the correct …
Hi guys,

I recently came across a requirement for data center work, an area I don't have any knowledge of. My background is in R&S, IP telephony, and structured cabling.
Can anyone tell me how I can get trained in data center design?
What is the best way to get trained, and what training courses are offered by Cisco, HP, and Dell?
I have developed a FreeRADIUS management system locally, but now my job has asked me to offer this system for sale in the cloud,
i.e., every user around the world can access our management system from the cloud.
Each user must register a new account in our system and specify his own database,
which means every user has a different database behind the same FreeRADIUS server.
My first problem is that when a new user registers, his database information goes into the sql.conf file; but as we know, FreeRADIUS does not pick up new configuration until it is restarted, so I must restart FreeRADIUS for every new user, interrupting everyone else's work.
The other question: if there are many users, each with their own database (for example, 100 users and one database per user), FreeRADIUS must connect to 100 different databases. Will this make FreeRADIUS slow?
I hope I explained my problem clearly.
Db: Oracle 11g

We have an ETL job, run via a third-party scheduler, that connects to an Oracle database via a DB link and fetches data into a file.

The query is as follows:

select
    <bunch of columns>
from
    user.view_object@remotedb

This fails on the first run with the following error:

Error code: 8103, Error message: ORA-08103: object no longer exists.

On a re-run, it completes just fine.

The object always exists on the remote database, so I cannot understand why it fails on the first run.
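
A hedged diagnostic rather than a definitive answer: ORA-08103 typically means the underlying object was dropped, truncated, or rebuilt while the query was running, which a maintenance or load window on the remote side could explain. Checking recent DDL on the remote object may confirm it (the object name is a placeholder):

-- Compare last_ddl_time against the time of the failing first run.
SELECT object_name, object_type,
       TO_CHAR(last_ddl_time, 'YYYY-MM-DD HH24:MI:SS') AS last_ddl
FROM   all_objects@remotedb
WHERE  object_name = 'VIEW_OBJECT';   -- placeholder for the real view name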

Any thoughts? Please let me know if you need more information.

Hi,

I am Rupesh, and I am new to this forum.

I am trying to build a tool in Excel that can update a database maintained in Access.

There will be three or four users of this Excel tool, all connected to one Access database.

Excel will act as the front-end tool and Access as the back end.

I want the Excel VBA to do three things:

1. Update data in Access based on two columns that act as a unique key.
2. Add a new row if the data doesn't match an existing unique key.
3. If the user deletes a particular row/range, delete it from the database as well.

Below sample of table:

Country      Product code      Sub-Brand      Mfg Base
Russia      SRU030008869020137      Syrups      Own
Russia      SRU030008869020141      Syrups      Own
Russia      SRU030008564220158      Tablets      Own
Russia      SRU030008564220258      Tablets      Own
Russia      SRU030008564220558      Tablets      Own
USA      SRU030008869020137      Syrups      Own
USA      SRU030008869020141      Syrups      Own
USA      SRU030008564220158      Tablets      Own
USA      SRU030008564220258      Tablets      Own
USA      SRU030008564220558      Tablets      Own

The first two columns will act as a unique key.

We cannot simply replace the whole table, since multiple users will be working on different regions; a wholesale replacement could change or erase others' data.
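
As a sketch of the SQL the VBA could execute (the table name Products and the staging table StagingData are made up for illustration), the usual Access pattern is an UPDATE joined on the composite key, followed by an INSERT of the rows that did not match:

-- 1) Update rows whose (Country, Product code) key already exists.
UPDATE Products AS t
INNER JOIN StagingData AS s
        ON (t.Country = s.Country) AND (t.[Product code] = s.[Product code])
SET t.[Sub-Brand] = s.[Sub-Brand],
    t.[Mfg Base]  = s.[Mfg Base];

-- 2) Insert rows whose key is not present yet.
INSERT INTO Products (Country, [Product code], [Sub-Brand], [Mfg Base])
SELECT s.Country, s.[Product code], s.[Sub-Brand], s.[Mfg Base]
FROM StagingData AS s
LEFT JOIN Products AS t
       ON (t.Country = s.Country) AND (t.[Product code] = s.[Product code])
WHERE t.Country IS NULL;

StagingData stands in for the Excel data, which could be a table linked into Access or a temporary table the VBA fills first.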

Thanks for the help in advance.

Regards,

Rupesh.
I need help improving the query below, potentially combining it without using UNION, and improving its execution time.

SELECT F.*
FROM (
    SELECT T.*,
           CASE WHEN T.availSeats = 1 THEN
                    (SELECT (CONVERT(int, CONVERT(varchar, T.sessionStart, 112))
                           - CONVERT(int, CONVERT(varchar, U.dateOfBirth, 112))) / 10000
                     FROM users U
                     INNER JOIN sessionMap SM2 ON SM2.userKey = U.userKey
                     WHERE SM2.sessionKey = T.sessionKey)
                WHEN T.repost = 1 THEN
                    (SELECT (CONVERT(int, CONVERT(varchar, T.sessionStart, 112))
                           - CONVERT(int, CONVERT(varchar, U.dateOfBirth, 112))) / 10000
                     FROM users U
                     INNER JOIN sessionMap SM3 ON SM3.userKey = U.userKey
                     WHERE SM3.sessionKey = T.sessionKey AND SM3.repost = 1)
                ELSE 0
           END AS age
    FROM (
        SELECT S.SessionKey, S.locationKey, S.locationKeyList, S.officeStateKey,
               SU.sessionStart, SU.sessionEnd, L.name AS locationName,
               L.address1, L.address2, L.city, L.state, L.zip, I.gender,
               (SELECT TOP 1 UIL.licenseCd
                FROM userInstructorLicense UIL
                WHERE UIL.userKey = I.instructorKey
                  AND UIL.expirationdt IS NOT NULL) AS badgeNum,
               (SELECT COUNT(1) FROM sessionMap SM
                WHERE S.sessionKey = SM.sessionKey) AS seatCount,
               S.Seats,
               IsNull(SU.btwSeatsOverride, S.Seats)
                 - (SELECT COUNT(1) FROM sessionMap SM
                    WHERE S.sessionKey = SM.sessionKey
                      AND SM.userKey <> 0) AS availSeats,
               LS.btwScheduleThreshold, LS.availStateCode,
               (SELECT TOP 1 IsNull(SM.repost, 0)
                FROM sessionMap SM
                WHERE SM.sessionKey = S.sessionKey
                ORDER BY IsNull(SM.repost, 0) DESC) AS
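
One direction worth trying, offered as a sketch only (built from the column names visible above, not from the full query, which is cut off): the repeated correlated COUNT(*)/TOP 1 subqueries against sessionMap can usually be collapsed into a single grouped derived table, so sessionMap is scanned once instead of several times:

-- "sessions S" stands in for the original multi-table FROM clause.
SELECT S.sessionKey,
       sm.seatCount,
       S.Seats - sm.activeCount AS availSeats,   -- apply btwSeatsOverride as before
       sm.maxRepost
FROM   sessions S
LEFT JOIN (
    SELECT sessionKey,
           COUNT(*)                                      AS seatCount,
           SUM(CASE WHEN userKey <> 0 THEN 1 ELSE 0 END) AS activeCount,
           MAX(IsNull(repost, 0))                        AS maxRepost
    FROM   sessionMap
    GROUP BY sessionKey
) AS sm ON sm.sessionKey = S.sessionKey;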


I have over 45 databases whose transaction logs need backing up. I could go through each database and manually back up the logs, but I would prefer not to, due to time constraints. Is there a way to run transaction log backups with one command, or to automate the process for all databases?
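
A hedged sketch of the automated route, with a placeholder backup path; in production this would normally run as a SQL Agent job or maintenance plan:

-- Back up the transaction log of every online database that is in FULL or
-- BULK_LOGGED recovery (SIMPLE-recovery databases have no log backups).
DECLARE @name sysname, @sql nvarchar(max);

DECLARE dbs CURSOR FOR
    SELECT name
    FROM   sys.databases
    WHERE  recovery_model_desc IN ('FULL', 'BULK_LOGGED')
      AND  state_desc = 'ONLINE'
      AND  database_id > 4;         -- skip system databases

OPEN dbs;
FETCH NEXT FROM dbs INTO @name;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'BACKUP LOG ' + QUOTENAME(@name)
             + N' TO DISK = N''D:\Backups\' + @name + N'_'
             + CONVERT(nvarchar(8), GETDATE(), 112) + N'.trn'';';
    EXEC sys.sp_executesql @sql;
    FETCH NEXT FROM dbs INTO @name;
END
CLOSE dbs;
DEALLOCATE dbs;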
Hi,
I would like to export the posts from an old WordPress site. I only want the posts, and I want them to override the ones on my new site. What do I do in MySQL to export the old posts and then import them into the new table?
Is it as simple as selecting the table and then importing it into the new WordPress site? I don't want to mess up anything on the new site (except the posts, of course!).
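
If both databases live on the same MySQL server, here is a hedged sketch of the table-to-table copy (the database names and the default wp_ prefix are assumptions; back up both sites first, since TRUNCATE is destructive):

-- Posts reference rows in wp_postmeta, so the two tables travel together.
TRUNCATE TABLE newsite.wp_posts;
INSERT INTO newsite.wp_posts
SELECT * FROM oldsite.wp_posts;

TRUNCATE TABLE newsite.wp_postmeta;
INSERT INTO newsite.wp_postmeta
SELECT * FROM oldsite.wp_postmeta;

Otherwise, exporting the same tables with a dump tool and importing them on the new server achieves the same thing.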
I hope someone can help me with this, please.
Thank you,
A
I know how to get the data I need, but I want to follow Laravel/MVC practices here, and I'm new to them.

I have a page that stores data that a user submits through a form. Once they submit, they are redirected to a new page where I'd like to display some of the data back to them.

For example, currently any data I want to display I'm just passing in the URL:
// Persist the trip, then redirect with identifying values in the query string.
$trip = new Trip;
$trip->tripName  = request('tripName');
$trip->departing = date('Y-m-d H:i:s', strtotime(request('departing')));
$trip->returning = date('Y-m-d H:i:s', strtotime(request('returning')));
$trip->save();

return redirect()->route('flight', [
    'tripID'   => $trip->id,
    'tripName' => $trip->tripName,
]);



Then within the view I just:
{{ $_GET['tripName'] }}



Is that how I should be doing it? Or, when they hit the new view, is there an easy way to grab that data with a DB call? If so, should that be done in the view, or do I need to serve it somehow from the controller?

Thanks
Good afternoon,

I have a database with hundreds of tables, and many of them have a field named LOCKED_BY_ID.
I need a query that lists all of the tables and the LOCKED_BY_ID values that are not null.

Here is a simplified example of one of the tables. Columns vary between tables.

CREATE TABLE [TestDatabase].[PART](
      [ID] [char](32) NOT NULL,
      [CREATED_ON] [datetime] NOT NULL,
      [CREATED_BY_ID] [char](32) NOT NULL,
      [LOCKED_BY_ID] [char](32) NULL,
      [DESCRIPTION] [nvarchar](256) NULL
)
GO
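
A hedged sketch of one way to do this in SQL Server: build a UNION ALL query over every table that has a LOCKED_BY_ID column, then execute it:

-- Generates and runs:  SELECT 'schema.table', LOCKED_BY_ID FROM ...
-- WHERE LOCKED_BY_ID IS NOT NULL  for each matching table.
DECLARE @sql nvarchar(max);

SELECT @sql = STUFF((
    SELECT ' UNION ALL SELECT ''' + s.name + '.' + t.name + ''' AS table_name,'
         + ' LOCKED_BY_ID FROM ' + QUOTENAME(s.name) + '.' + QUOTENAME(t.name)
         + ' WHERE LOCKED_BY_ID IS NOT NULL'
    FROM sys.columns c
    JOIN sys.tables  t ON t.object_id = c.object_id
    JOIN sys.schemas s ON s.schema_id = t.schema_id
    WHERE c.name = 'LOCKED_BY_ID'
    FOR XML PATH(''), TYPE).value('.', 'nvarchar(max)'), 1, 11, '');

EXEC sys.sp_executesql @sql;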

Thanks!
Greetings,

We have a virtual server with 48 GB of RAM and 2 processors. Performance, according to Spotlight, is terrible, and we are getting a lot of latching and I/O alerts. Which SQL Server database configuration settings should I change to get rid of this problem, or should I add more CPUs and more RAM? Please advise.
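
Before adding hardware, a hedged starting point is to confirm how memory is configured and where the waits actually come from:

-- Is max server memory left at the default (i.e., effectively unlimited)?
EXEC sys.sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sys.sp_configure 'max server memory (MB)';

-- Top wait types since the last restart: helps distinguish latch pressure
-- (PAGELATCH_*/LATCH_*) from disk I/O waits (PAGEIOLATCH_*).
SELECT TOP (10) wait_type, wait_time_ms, waiting_tasks_count
FROM   sys.dm_os_wait_stats
ORDER  BY wait_time_ms DESC;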
My files are not getting uploaded to the database. I guess the tmp folder is not accepting them; how do I solve this?
The client is looking to have 3 separate locations connect to FileMaker databases. Sometimes 2 users will want to access the same file.

Can they set up FileMaker Pro on a virtual private server and use that copy to serve files? What is the current limit on open files for non-server FileMaker?
If they purchased FileMaker Server, would that work?

Could they run FileMaker remotely on a VPS?

Is there a way to do this?

Thanks.

Hi
I am new to Microsoft SQL Server Management Studio (version 11.0.210), which I use as the client IDE to connect to the database.

How do I search stored procedure, function, and other object names?
What are some tips and best practices for using it? Please provide good links and resources.
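
For the object-name search part, a hedged sketch of the usual catalog-view approach ('customer' is a placeholder search term):

-- Search object names.
SELECT name, type_desc
FROM   sys.objects
WHERE  name LIKE '%customer%';

-- Search inside stored procedure / function / view definitions.
SELECT OBJECT_NAME(object_id) AS object_name
FROM   sys.sql_modules
WHERE  definition LIKE '%customer%';

In SSMS itself, the Object Explorer Details pane (F7) also offers a search box over object names.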
Hi there,
We have log shipping for 4 databases, which is going well; however, the restore jobs for two of the databases occasionally fail in the DR environment. The error message reads:
Executed as user: xxxxxxxxx. The process could not be created for step 1 of job 0x867837754D077D4A9BFF608D000072BD (reason: 5).  The step failed.

I have configured Job -> Step -> Advanced -> Output file to capture the error message, but the error was not written to the output file either. In fact, the error doesn't seem to be captured at all: the output file only contains the log written before the last, failed run.
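
Since the step output file is empty, a hedged fallback is to read the full message straight from the job history in msdb:

-- Most recent failed step executions, with the complete message text.
SELECT j.name AS job_name, h.step_id, h.run_date, h.run_time, h.message
FROM   msdb.dbo.sysjobhistory AS h
JOIN   msdb.dbo.sysjobs       AS j ON j.job_id = h.job_id
WHERE  h.run_status = 0            -- 0 = failed
ORDER  BY h.instance_id DESC;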

thanks
Deepak
Regards All,

What options are out there if I want to export a query result to a tab-separated file and store it on a network share?
The size can be about 100 MB, so quite a lot of data.

I'd rather not make it an Integration Services package/job; I'd prefer to do it directly from the Oracle database.

So:
1. Run a query that generates 500k rows.
2. Export the result to a network share as data.txt (tab-separated).

Or maybe instead:
1. Run a scheduled query to store the data into a temp table.
2. Export from the temp table to a text file.

Preferably as scheduled work (a cron job or anything similar), so one doesn't have to do this manually.
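
A hedged sketch of the SQL*Plus spool route, runnable via a cron job that invokes sqlplus with a script file; the path, columns, and table name are placeholders:

-- export.sql: produce tab-separated output (CHR(9) is the tab character).
SET HEADING OFF PAGESIZE 0 FEEDBACK OFF TRIMSPOOL ON LINESIZE 32767
SPOOL /mnt/share/data.txt

SELECT col1 || CHR(9) || col2 || CHR(9) || col3
FROM   some_table;

SPOOL OFF
EXIT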

/BK
I have increased the memory on the AD and Exchange servers. Will Exchange's store.exe take the added memory as well? Is that normal? SQL Server databases will take the entire physical memory and then release it; is this the same?
We have a system in place which is hosted (SaaS) by the software manufacturer/supplier.

We have a small internal team in place to run the day-to-day admin and management of the system, like setting up new users, running reports etc.

However, I've become concerned, as I've been informed that the host/supplier can access the database, set up users within the database, and generally write directly to the tables. This obviously bypasses all the application controls that are in place.

We can, and probably will, run reports on users, so we should be able to identify any new users that might have been set up in this way. However, that covers only a small fraction of what could possibly be done via the tables, for example, changing the bank details for payments to your own.

We can try to monitor this as much as possible, but monitoring has its limits, and these are detective rather than preventative controls.
The problem, I think, is that they developed this system and now they host it, so in theory they can do whatever they want. Has anyone else been in this situation? Is there anything we can put in place to prevent this from happening, rather than just detecting it? At the moment the only thing I can think of is making them promise they won't do this, which is pretty useless.

They do provide some system support, so we can't just remove their access altogether, and as we don't have control over the database, we can't insist access is disabled until they come through us to enable …
Hi Experts,
I've lost the ability to connect to MySQL; the error message is attached. Can I recover the data from ibdata1, ib_logfile1, and ib_logfile0 on another machine?
Regards
