We are inserting around 7 million (70 lakh) rows into SQL Server from a .NET application in a single transaction. This process runs several times a day, and we cannot use SSIS for it.
The issue is that after each run the database transaction log grows by about 3 GB, so we are constantly running out of disk space.
As a temporary fix for the disk space issue, we back up and restore the database to clear the log file, but this is a crude workaround that requires frequent manual intervention.
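(For context, if the database is in the FULL recovery model, we understand that a transaction log backup alone truncates the inactive portion of the log, which might replace the full backup/restore cycle. A minimal sketch of what we mean, where the database name and path are placeholders, not our real ones:)

```sql
-- Assumes FULL recovery model; MyDb and the path are placeholders.
-- Backing up the log marks its inactive portion as reusable,
-- though it does not by itself shrink the physical log file.
BACKUP LOG MyDb TO DISK = N'D:\Backups\MyDb_log.trn';
```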
What we are thinking of now is the following:
1) Database population starts.
2) The log file grows automatically.
3) The process finishes (the transaction commits).
4) The log file is then reduced back to its original size (100 MB) with some SQL command.
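(For step 4, the command we have in mind is something like DBCC SHRINKFILE, run after the log has been truncated — by a log backup in full recovery, or automatically at checkpoint in simple recovery. A minimal sketch, assuming a placeholder database MyDb whose log's logical file name is MyDb_log:)

```sql
-- Placeholder names; 100 is the target size in MB.
USE MyDb;
DBCC SHRINKFILE (MyDb_log, 100);
```

We are not sure whether running this after every load is advisable, which is part of the question.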
Please suggest a way to solve this problem.