
Monday, March 26, 2012

Named Instance of SQL Server in Java

I have a Java compiled jar file, which accepts a connection string as follows: -Uusername -Ppassword -Sservername:port

This string works fine with the default instance of SQL Server, that is, as long as you supply the name of the SQL Server as 'SQLServer'. But if I want to run it against a named instance of SQL Server like 'SQLSERVER\NamedInstance', the connection fails, giving me an unknown host error. Any idea why?|||I've got this connection string

driver={SQL Server};server=MachineName\Mercure;uid=sa;pwd=pwd

I've created the instance name during the installation of MSDE
and it seems that SQL Server was installed with the computer name

ONLINE HELP

Computer Name
The Computer Name dialog box in Setup allows you to install Microsoft SQL Server 2000 on your local computer, on a remote computer, or on a virtual server.

All options for installing and upgrading are available on the local computer. Advanced options, including registry rebuild, unattended installation, and upgrading to a cluster are not available on a remote installation. If you are running Setup on a clustered computer, the Virtual Server option is available.

Options
Local Computer

By default, the name in the edit box is the local machine name, that is, the computer on which Setup is running. For a local installation, accept the default and click Next.

Note If you are installing tools only, Local Computer will be the only option available on this dialog box.

Remote Computer

Enter a computer name for a remote installation, or click Browse to locate the remote computer.

Virtual Server

Enter the name of a new or existing Virtual SQL Server to manage.

This option is available only when Microsoft Cluster Service (MSCS) is detected on a Windows NT or Windows 2000 Enterprise operating system.

Browse

Click the Browse button to locate a remote computer.

This button is available only when the Remote Computer option is selected.|||What JDBC driver are you using? If it's Microsoft's then it does support named instances. I think this was introduced in SP2 of the JDBC driver. Let me know and I'll post an example. If it's another JDBC driver, you can usually supply the port number the instance is running on in the connection string.|||-S//MachineName/DBNAME:1599

where 1599 is the named instance port number.
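If you don't know which port the named instance listens on, one option on SQL Server 2000 is to read it from the registry with the undocumented xp_regread procedure. A minimal sketch, assuming the instance is named 'NamedInstance' (adjust the path for yours):

EXEC master..xp_regread 'HKEY_LOCAL_MACHINE',
'SOFTWARE\Microsoft\Microsoft SQL Server\NamedInstance\MSSQLServer\SuperSocketNetLib\Tcp',
'TcpPort'

The same value is visible in the Server Network Utility under the instance's TCP/IP properties.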

Wednesday, March 21, 2012

Name Flat file at run time

Hi,

I create a flat file in my package using a Data Flow task, and then I need to send that text file to an FTP site. The problem I am facing is that I need to generate the file name at run time; the file name should be like "abc.yymmdd.txt", where yymmdd is the year, month and day.

Is there any way to give name to a flat file?

Thanks,

Pooja

Your Connection Manager has a property collection called Expressions. Expand that, and set the ConnectionString property to a variable expression.|||

"ABC." + (DT_WSTR,8) ( (YEAR(@.[User:Big SmileateTimeVar]) * 10000) + (MONTH(@.[User:Big SmileateTimeVar]) * 100) + DAY(@.[User:Big SmileateTimeVar]) ) + ".TXT"

Change the variable reference (@[User::DateTimeVar]) to GETDATE() if you want today's date to be used.
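Note that the expression above yields an eight-digit yyyymmdd stamp (e.g. ABC.20120321.TXT), while the question asked for yymmdd. For reference, here is the same yymmdd logic sketched in T-SQL - CONVERT style 112 gives yyyymmdd, and RIGHT(..., 6) drops the century:

SELECT 'abc.' + RIGHT(CONVERT(char(8), GETDATE(), 112), 6) + '.txt'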


Name cannot begin with the '>' character, hexadecimal value 0x3E

Hi,

This is my first attempt at SSRS. I've added a code section to the XML file, but when I use <> for not equal to, it gives me this error. What should I do? The code is as posted here.

Thanks,

Debi

***************************************************************************************

<Code>
Dim PrevId as Object
Dim a_Count As Double
Dim b_Count As Double
Dim c_Count As Double
Dim d_Count As Double
Dim e_Count As Double
Dim Total As Double = 0
Function SumMPCount(ByVal CurrID As Object,ByVal code As Object, ByVal NewCount As Object) As Double

If(CurrID is Nothing or code is Nothing or NewCount is Nothing)Then
Exit Function
End if
If PrevId Is Nothing Then
PrevId = CurrId
End if
If CurrID <> PrevID Then
a_Count = 0
b_Count = 0
c_Count = 0
d_Count = 0
e_Count = 0
Total = 0
End if
Select Case code
Case "a"
a_Count = a_Count + NewCount
SumMPCount = a_Count
case "b"
b_Count = b_Count + NewCount
SumMPCount = b_Count
case "c"
c_Count = c_Count + NewCount
SumMPCount = c_Count
case "d"
d_Count = d_Count + NewCount
SumMPCount = d_Count
case "e"
Total = a_Count + b_Count + c_Count + d_Count + e_Count
SumMPCount = Total
case else
e_Count = e_Count + NewCount
SumMPCount = e_Count
End Select

PrevId = CurrId
End Function
</Code>

It looks like you directly copy&pasted the code snippet into the RDL (XML) file, and therefore the <> is not encoded and is interpreted as an invalid XML element.

You have two options - either copy&paste the code snippet into the report designer custom code window (and report designer will automatically encode the characters correctly), or replace the <> in the RDL file with &lt;&gt; (which is the encoded representation).
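For example, the comparison from the code above would have to appear in the raw RDL as:

If CurrID &lt;&gt; PrevID Then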

-- Robert

|||

Thanks for the reply. Where is the custom code window? When I right click on the report and select properties, it just shows me the usual Properties tab on the RHS which shows background color, image, etc..

I actually worked around it using Object variables and using "Is" instead of <> .

|||I think he might have meant
right click on the report.rdl -> Properties -> Code tab -> paste there.

n00b: best way to import flat file with BIDS

Hi all,

I'm totally new to SQL Server 2k5 and need to do something rather basic: import some CSV files into tables. I'm getting conversion errors and would like to know the best way to cast the strings before inserting.

I'm doing the import in BI Development Studio.

Current situation:

Created connection managers to csv files
created SQL server destinations pointing to the tables
connected them directly with a dataflow path
Ran the package: one import went just fine, the other one complains about conversion errors like "Conversion DT_STR and DT_I4 not supported"

Both tables have the same kind of fields (varchar, float, datetime, int)

I looked at converting the data using a transformation but am somewhat confused about which one to use.

What's the best way to transform the data before insert: derived column, import column or data conversion? Or something else I overlooked?

TIA

Peter|||For doing this kind of work, I prefer to use bcp.exe over SSIS/DTS. It's a simple command line utility.

Look for "bcp utility" in BOL.

Monday, March 19, 2012

Mysteriously dropped rows using Data Flow Task

I'm using the Data Flow Task to load data from a flat file into a SQL table and I'm missing rows. And there doesn't seem to be any consistent or obvious reason why.

When I use the Bulk Insert Task I import the correct number of rows from the flat file. But when I use the Data Flow task with a Flat File Source connected to an OLE DB Destination, I get about 1/3 the right number of rows. Comparing the two loaded tables side by side, I notice that the Data Flow Task method just skips rows sometimes.

Why does this happen?

Try to enable error output and redirect the error output to a different table and see if any rows are sent there.

Hope this helps,

Ovidiu Burlacu

|||

just for people looking around; this issue got addressed in another post: http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=884012&SiteID=1

Rafael Salas

Friday, March 9, 2012

mysql installation problem in Xp

hello !

I am not able to install MySQL on my PC running XP.
The log file shows the server is ready for connections on port 3306,

but the client is still not able to connect to the server.
The error says it can't connect to the server on localhost (10061).

winmysqladmin is showing green, indicating the server is running, but the server
and client info are not there.

Please help.

smita|||[posted and mailed]

Smita (smitap56@.rediffmail.com) writes:
> I am not able to install Mysql on my pc runnig XP
> .the log file shows server is readyfor connection on port 3306.
> but still client not able to server .
> error comes can't connect localhaost to server on port 10061
> winmysqladmin showing green indiacting server is running but server
> and lient info are not there.

Please try http://www.mysql.com. I believe they have some support forum
there. In this group we discuss Microsoft SQL Server, so we can't help
you.

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp

Wednesday, March 7, 2012

mydatabase.ckp ?

Hi
A file has been generated in my MSSQL\backup directory called [myDB].ckp.
It's only a 6K binary file, but I've no idea what it is.
Any suggestions?
Thanks
Griff|||I remember seeing this. A CKP file gets created when you cancel a
backup/restore operation. I think it is used for restarting a failed
backup/restore operation.
--
HTH,
Vyas, MVP (SQL Server)
http://vyaskn.tripod.com/
What hardware is your SQL Server running on?
http://vyaskn.tripod.com/poll.htm
"GriffithsJ" <GriffithsJ_520@.hotmail.com> wrote in message
news:%23CBUU%233zDHA.2872@.TK2MSFTNGP09.phx.gbl...
Hi
A file has been generated in my MSSQL\backup directory called [myDB].ckp.
It's only a 6K binary file, but I've no idea what it is.
Any suggestions?
Thanks
Griff|||Cool, thanks.
I can in fact verify this - the update did fail (it wouldn't overwrite the
data and log files).
Griff

My whole diskspace is used by a .LDF file?

As a newbie I work with MS SQL 2005. On this server I suddenly ran
out of diskspace and found out that a .LDF file uses 55GB+ ?!? The
database is just 2-3GB.
I think I have learned that I need to back up the database and this
logfile on a daily basis to free the diskspace (or at least to let it
be reused).
But at this moment....
I don't have enough diskspace to back up the logfile, and furthermore I
don't have tape or DVD or anything else that could be useful for
this purpose.
Can anyone give me a workaround, so I can shrink this logfile?
After that I promise to do a daily backup :-)
Thanks
What recovery mode are you in? Do you ever perform backups?
<finn@.boendergaard-jakobsen.dk> wrote in message
news:1171377134.744446.116340@.q2g2000cwa.googlegro ups.com...
> As a newbie I work with MS SQL 2005. On this server I suddently ran
> out of diskspace and found out that a .LDF file uses 55GB+ ?!? The
> database is just 2-3GB.
> I think I have learned that I need to backup the database and this
> logfile on a daily basis to free the diskspace (or at least to let it
> be reused).
> But a this moment....
> I don't have enough diskspace to backup the logfile and futher more I
> don't have tape or DVD or anything else, that could be usefull for
> this purpose.
> Can anyone give me a workaround, so I can shrink this logfile?
> After that I promise to do a daily backup :-)
> Thanks
>
|||On Feb 13, 8:32 am, f...@.boendergaard-jakobsen.dk wrote:
> As a newbie I work with MS SQL 2005. On this server I suddently ran
> out of diskspace and found out that a .LDF file uses 55GB+ ?!? The
> database is just 2-3GB.
> I think I have learned that I need to backup the database and this
> logfile on a daily basis to free the diskspace (or at least to let it
> be reused).
> But a this moment....
> I don't have enough diskspace to backup the logfile and futher more I
> don't have tape or DVD or anything else, that could be usefull for
> this purpose.
> Can anyone give me a workaround, so I can shrink this logfile?
> After that I promise to do a daily backup :-)
> Thanks
To fix your immediate problem, do the following:
1. Run this command - BACKUP LOG dbname WITH TRUNCATE_ONLY
2. DBCC SHRINKFILE logfilename
3. Do a full database backup - BACKUP DATABASE dbname TO
DISK=backupfilename
Now you need to decide what to do to prevent this from happening
again. Backing up the log file once per day is sort of pointless.
The intent behind transaction log backups is to allow you to restore
to a specific point in time, reducing your chances of data loss.
Backing the log up once per day gives you the potential to lose an
entire day's worth of data - if that risk is acceptable to you, just
forget the log file backups, put the database in Simple mode and do
daily full backups. I would suggest doing log backups AT LEAST once
per hour, we do them every five minutes here. Consider the following
scenario:
- nightly full backup at 9:00pm, no log backups
- database fails at 3:00pm
- only option for recovery is to restore the 9:00pm backup, losing all
work done today
Compare to this scenario:
- nightly full backup at 9:00pm
- log backup every 15 minutes
- database fails at 3:00pm
- recovery of all data up to 2:45pm is possible, by restoring the full
backup (WITH NORECOVERY) followed by each subsequent log file backup.
Which would you prefer?
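Pulling the steps above together, a minimal T-SQL sketch for SQL Server 2005 - the database name, logical log file name, and backup path are placeholders, and note that WITH TRUNCATE_ONLY was removed in SQL Server 2008:

USE MyDb
GO
BACKUP LOG MyDb WITH TRUNCATE_ONLY
GO
DBCC SHRINKFILE (MyDb_log, 100) -- target size in MB; logical name from sys.database_files
GO
BACKUP DATABASE MyDb TO DISK = 'D:\Backups\MyDb_full.bak'
GO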


My SSIS problems text file, date data types

There are things I just don't get about MS programming:

1. Text file source - recognizes column as "database date [DT_DBDATE]" - but not able to process it as date - values like "2006-12-06T13:16:04"

2. When using import export wizard - it errors out when trying to transfer large amount of rows (40 million) - 1st time out of space error - but the db is in simple recovery model - and even though I increased logs to max & moved them to different HDD - I DO NOT NEED the logging.

3. MS Full text search & full-text index performance issues - our keyword search more often than not times out (it's probably not related to SSIS - but if anyone has suggestions - please do)

I just wish that I could spend more time on getting stuff done - so any help on those problems is appreciated.
Thanks,|||The fact I got problem no.2 makes me look stupid - and I don't like to look stupid - so therefore I blame the thing that made me look stupid = MS.

What you need to do to solve number 2 is: save the wizard output to an SSIS package, then change the destination setting "Maximum insert commit size" to an appropriate value (1 million in my case).

By default Import/Export wizard tries to save all records in 1 transaction - BUT saving 40 million rows in 1 shot (even in simple recovery model) is STUPID STUPID STUPID|||

TheViewMaster wrote:

By default Import/Export wizard tries to save all records in 1 transaction - BUT saving 40 million rows in 1 shot (even in simple recovery model) is STUPID STUPID STUPID

I disagree. What should the default be, then?|||If only they'd add the mind reading interface I keep requesting :)|||

Perhaps another way to look at this is in the context of "reasonable defaults." I would not expect the default import/export size to be 40 M rows, so I personally do not think that the package created by default should need to handle this data volume without some sort of review or modification.

Still, it would not hurt to have an "Advanced Options" (or whatever - pick your name) page in the wizard that asked data size and volume questions, and set some of these properties more intelligently based on this input.

|||

MatthewRoche wrote:

Still, it would not hurt to have an "Advanced Options" (or whatever - pick your name) page in the wizard that asked data size and volume questions, and set some of these properties more intelligently based on this input.

This is a good idea. Either yourself or TheViewMaster should post a Connect feedback item requesting this feature to be added.|||

Phil Brammer wrote:

MatthewRoche wrote:

Still, it would not hurt to have an "Advanced Options" (or whatever - pick your name) page in the wizard that asked data size and volume questions, and set some of these properties more intelligently based on this input.

This is a good idea. Either yourself or TheViewMaster should post a Connect feedback item requesting this feature to be added.

Posted: https://connect.microsoft.com/SQLServer/feedback/ViewFeedback.aspx?FeedbackID=278281

Odd that there is no SSIS category available...

|||

I'm having a heck of a time converting a very simple text file import DTS package (it takes 2 minutes to create the whole thing in DTS). Initially, I had the same problem as you - couldn't get the date columns to work no matter what I tried. Finally I changed the destination columns to strings (now I'll see if I can convert the strings in that table to another table with the proper 'date' data type).

I'm also having trouble with the thing telling me it can't import a column (from the same file) because of potential data loss (I've tried real, float, numeric types). The source data is not that large (less than 10,000,000.00).

I've been a huge fan of Microsoft and SQL Server (especially DTS) since SS 7.0. So far, this is the worst software product I've ever encountered.

|||

jw6587 wrote:

I'm having a heck of a time converting a very simple text file import dts package (takes 2 minutes to create the whole thing in dts). Initially, I had the same problem as you - couldn't get the date columns to work no matter what I tried. Finally I changed the destination columns to strings (now I'll see if I can convert the strings in that table to another table with the proper 'date' data type.

I'm also having trouble with the thing telling me it can't import a column (from the same file) because of potential data loss (I've tried real, float, numeric types). The source data is not that large (less than 10,000,000.00).

I 've been a huge fan of Microsoft, SQL Server (especially dts) since SS 7.0. So far, this is the worst software product I've ever encountered.

Wow.

I certainly cannot argue with your experience, but for me the opposite was true. For me, DTS was practically impossible to work with for anything but the most trivial uses. My standard line when speaking on BI topics is that I would prefer to chew off my own mouse finger rather than work with DTS on a project. SSIS certainly has its faults (and upgrading DTS packages is definitely not one of its strengths), but for data warehousing and other real-world ETL purposes it is light-years ahead of DTS.

|||

MatthewRoche wrote:


Posted: https://connect.microsoft.com/SQLServer/feedback/ViewFeedback.aspx?FeedbackID=278281

Odd that there is no SSIS category available...

There is... Sort of. It's DTS.

Can you edit the posting and change it to DTS?

Thanks,
Phil|||

Phil Brammer wrote:

MatthewRoche wrote:


Posted: https://connect.microsoft.com/SQLServer/feedback/ViewFeedback.aspx?FeedbackID=278281

Odd that there is no SSIS category available...

There is... Sort of. It's DTS.

Can you edit the posting and change it to DTS?

Thanks,
Phil

Apparently not. The only field that I can edit is the private/public flag.

|||

MatthewRoche wrote:


Apparently not. The only field that I can edit is the private/public flag.

That is weird. I just checked one of mine and I can edit most fields. Especially the category drop down, which is what needs to be changed in this case. I can't offer any help on that, I guess.|||Awesome feedback (and at least I am not the only one who experiences technical difficulties with this product) - thanks guys.

The only improvement over DTS I see is the error handling/logging mechanism. Having worked with SSIS for close to a year now, I've trimmed down my toolbox to just "Execute SQL", "Script Task" and very simple data flows.

One thing I've spent a lot of time on, though, is setting up the proper data types for text files (I've already moaned about how cumbersome that is - there should be a way to input your own format file instead of typing & mouse selecting). Given that SSIS can't handle our datetime values, and that it is difficult to correct a production SSIS package when a text file field changes (with the amount of files we process it happens quite often - a few times a month perhaps), our consultant suggested importing all data as strings to the database and then using SQL to do the conversions. In that case, when a file changes, you only need to modify your SQL (views).

So what are your thoughts about trimming down SSIS to just importing everything as text to database?|||

TheViewMaster wrote:

So what are your thoughts about trimming down SSIS to just importing everything as text to database?

My thoughts are it's a stupid idea.
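For what it's worth, the import-as-string approach mentioned above usually ends with a T-SQL conversion step. A sketch with hypothetical staging and target tables, using CONVERT style 126 for ISO 8601 values like the 2006-12-06T13:16:04 example from the original post:

INSERT INTO dbo.Orders (OrderDate, Amount)
SELECT CASE WHEN ISDATE(OrderDateStr) = 1
            THEN CONVERT(datetime, OrderDateStr, 126) END,
       CONVERT(float, AmountStr)
FROM staging.Orders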


my SQL Server is too slow

Hi, All
Recently, after I restored a production DB (SQL 7.0 + SP3)
backup file to my development DB, my computer (development DB --
desktop version) is too slow when I access the development DB. It
was fast before and it is slow right now. I didn't change
anything on my computer or server. The SQL Server is still
fast; only my computer running the desktop development DB is
slow. My PC is a P4 2.4GHz with 256MB RAM and a 120GB hard drive. I
checked my PC using Norton AntiVirus and no virus was found.
Any ideas or suggestions?
Thanks and have a good one
Hawk|||Most likely the 120GB hard drive simply cannot compare in performance
against the server, which could have RAID or high-cache disk drives. Also,
256MB may be a small amount of RAM given the background services running locally on
your workstation.
Christopher Winn
Business Intelligence Engineer
Edugration, Corp.
"hawk" <abchawk@.hotmail.com> wrote in message
news:11f4201c410ed$bb3b8470$a401280a@.phx.gbl...
> Hi, All
> recently, after I restored production DB (SQL7.0 +sp3)
> backup file to my develope DB, my computer (develope DB --
> desktop version)is too slow when I access develope DB. It
> was fast before and it is slow right now. I didn't change
> anything to my computer or server. The SQL server is still
> fast, only my computer run the desktop develope DB is
> slow. my pc is p4 2.4Ghz and 256 RM 120GB hard drive. I
> checked my pc using Norton AntiVirus and non virus found.
> Any ideas or suggestions?
> Thanks and have a good one
> Hawk|||To quantify what Chris Winn said, you should run Performance Monitor and
check % Processor Time (Processor), % Disk Time (PhysicalDisk), and %
Committed Bytes In Use (Memory) while you're running your queries. I think
what you'll see is % Disk Time very close to 100, % Processor Time very low,
and % Committed Bytes In Use very high.
"hawk" <abchawk@.hotmail.com> wrote in message
news:11f4201c410ed$bb3b8470$a401280a@.phx.gbl...
> Hi, All
> recently, after I restored production DB (SQL7.0 +sp3)
> backup file to my develope DB, my computer (develope DB --
> desktop version)is too slow when I access develope DB. It
> was fast before and it is slow right now. I didn't change
> anything to my computer or server. The SQL server is still
> fast, only my computer run the desktop develope DB is
> slow. my pc is p4 2.4Ghz and 256 RM 120GB hard drive. I
> checked my pc using Norton AntiVirus and non virus found.
> Any ideas or suggestions?
> Thanks and have a good one
> Hawk|||Hi, Winn,
I'm sorry. What I meant to say is that my desktop version of SQL
Server 7.0 is slow. It was fast before; right now it is slow
even when I click any item in EM, and of course queries are slow
too. My desktop SQL Server is my development DB and only
I can access it, nobody else uses it. I will try to rebuild the
indexes and see what happens.
My production DB on the Win2000 server is still fast, no
problem at all.
thanks!
hawk

>--Original Message--
>Most likely the 120GB Hard Drive has absolutely no comparison of performance
>against the server. Server could have RAID or high cache disk drives. Also,
>that may be a small amount of RAM for background services running locally to
>your workstation.
>Christopher Winn
>Business Intelligence Engineer
>Edugration, Corp.
|||As for the slowness, see if the autoclose database option is on for the
databases. Also, see if ODBC tracing is turned on at the client.
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
"Hawk" <anonymous@.discussions.microsoft.com> wrote in message
news:d76b01c410f8$8e6b5870$a001280a@.phx.gbl...
> Hi, Winn,
> I;m sorry. What I want to say is my desktop version SQL
> 7.0 server is slow. It was fast. right now, it was slow
> even i click any item from EM, of couse, the query is slow
> too. My desltop version SQL server is develope DB and only
> me can access it, none other uses. I try t0 rebuild the
> index and to see what happen
> my production DB on the win2000 server is still fast. not
> problem at all
>
> thanks!
> hawk
>
>
|||thanks Tibor,
autoclose database is on and I will set it to off.
But how can I know if ODBC tracing is turned on at the client?
How can I check it? Could you tell me about it?
Thanks and have a good day
hawk
>--Original Message--
>As for the slowness, see if the autoclose database option
is on for the
>databases. Also, see if ODBC tracing is turned on at the
client.
>--
>Tibor Karaszi, SQL Server MVP
>http://www.karaszi.com/sqlserver/default.asp
>|||You find the ODBC applet in the Administrative Tools program group. Here you
can see if it is running and if so, turn it off.
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
"hawk" <anonymous@.discussions.microsoft.com> wrote in message
news:f32e01c410ff$18243ef0$a601280a@.phx.gbl...
> thanks Tibor,
> autoclose database is on and i will set it to off.
> but how can I know ODBC tracing is turned on at client?
> How can I check it ? Could you tell me about it?
> Thnaks and have a good day
> hawk
>
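For reference, a sketch of turning that option off on SQL Server 7.0 with sp_dboption (the database name is a placeholder):

EXEC sp_dboption 'MyDevDb', 'autoclose', 'false'

On SQL Server 2000 and later the equivalent is ALTER DATABASE MyDevDb SET AUTO_CLOSE OFF.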

Monday, February 20, 2012

My package does not take the values configured in the configuration file.

Hi all,

I created a simple file system task to copy a file from one location to another on my local system.

I used variables to specify the source and destination paths, but even if I configure the values to something else in my configuration file, the package still takes the initial default values and not the values specified in the configuration file.

Can anyone throw some light on this.

Thanks in advance.

First of all, check that you are using the configuration file in your package correctly.

Second, check that the variable is being mapped (present) in your configuration.

These are all the things I can assume from the scenario you explained.

Write back to me if you have any luck.

regards,

Anas

|||Also, look at the execution log of the package; a warning is generated each time a configuration does not take place. If you are executing it via BIDS, go to the Progress tab; otherwise enable package logging.

My Own - Multi Select Parameter Passing Implementation

Hi,
I would like to share how I implemented passing multi-select
parameters into the report in my application.
1. In my .net file, I have created a list box into which I load all the multi
values that the user wants to report on (for ex: Equipment Id = 100,200,300,400 and so on).
2. When the user submits the page, I load the above list into a table.
3. In my stored procedure, I go after this table and build the SELECT
statement.
4. Since the report has to be refreshed every time they add another value into
the list, the report has to know that it has to refresh. For this, what I have
done is extract the SEQUENCE # from the table that I load all the values into, and
pass it to the report (it is a unique # every time you refresh the page).
You can also pass a random # but the uniqueness is not guaranteed and the
report may not refresh.
In the Report Designer:
1. I create a parameter called RefreshReport. This value that gets passed
into the parameter is the SEQUENCE value.
I pored through 3 books for some clue and couldn't find one hint.
I will be more than happy to answer any questions regarding this
implementation..|||Hi, can you be more detailed? I've been following a guide from a book;
there's this query:
="SELECT * FROM vProductProfitability WHERE (Year = @Year) AND (MonthNumberOfYear = @Month) AND (Category IN (" + Parameters!CategoryString.Value + "))"
Yes, like what you said, you created a new parameter; here, the param
is CategoryString. I don't know where to put this query since I'm using a
stored procedure.. the only way we can use this query is by using text. So
about your stored procedure that you changed.. what is it?
What I can still select is only 1 item; although I'm using a checkboxlist, I can
select all of them, but with an error of course.
Can you please discuss how you managed to enable multi select, since you've
posted a helpful memo in here already.. thanks in advance!
--
~SiMPLe~
"Suresh" wrote:
> Hi,
> I would like to share how I implemented in my application by passing multi
> select parameters into the report.
> 1. In my .net file, I have created a list box which I load all the multi
> values that the user wants to report on(for Ex: Equipment Id => 100,200,300,400 and so on).
> 2. When the user submits the page, I load the above list into a table.
> 3. In my stored procedure, I go after this table and build the SELECT
> statement.
> 4. Since the report has to refreshed everytime they add another value into
> the list, the report has to know that it has to refresh. For this what I have
> done is on the table that i load all the values, I extract the SEQUENCE # and
> pass it to the report(it is a unique # everytime when you refresh the page).
> U can also pass a Random # but the uniqueness is not guaranteed and the
> report may not refresh.
> In the Report Designer:
> 1. I create a parameter called RefreshReport. This value that gets passed
> into the parameter is the SEQUENCE value.
> I poured my head into 3 books for some clue and couldn't find one hint.
> I will be more than happy to answer any questions regarding this
> implementation..
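As an illustration of step 3 in the approach above, the stored procedure's query might look roughly like this - the table and column names are hypothetical, with the sequence number passed as the report's RefreshReport parameter identifying the current selection set:

SELECT e.*
FROM dbo.Equipment AS e
WHERE e.EquipmentId IN
    (SELECT s.EquipmentId
     FROM dbo.SelectedEquipment AS s
     WHERE s.SequenceNo = @RefreshReport)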

My menubar is all messed up

Ok..this is a weird one.
When I open up SQL Server Management Studio Express, my menubar has multiple
instances of the File, Edit, View... menu options... and every toolbar
(Standard, Query Designer, Database Diagram, etc.) has multiple options
as well.
I've gone in and modified those items as well, and I get it to a point where
it looks ok; then when I shut it down and re-open it, all my changes are gone
and it's back to the way it was.
Can anyone help me on this one? Is this the right place to post this
question?
Thanks in advance,
John Scott.
SQL Server 2005 stores its configuration information in the following location:
My Documents\SQL Server Management
Studio\Settings\CurrentSettings-2007-06-06.vssettings
Check to make sure there are no problems writing to that file.
The path information for the config file is located at
HKEY_CURRENT_USER\Software\Microsoft\Microsoft SQL
Server\90\Tools\Shell\Profile\
Thanks!
Mohit K. Gupta
B.Sc. CS, Minor Japanese
MCTS: SQL Server 2005
"John Scott" wrote:

> Ok..this is a weird one.
> When I open up SQL Server Management Studio Express my menubar has multiple
> instances of the File, Edit, View...menu options... and every toolbar
> (Standard, Query Designer, Database Diagram, etc..) all have multiple options
> as well.
> I've gone in an modified those items as well and I get it to a point where
> it looks ok, then when I shut it down and re-open it, all my changes are gone
> and it's back to the way it was.
> Can anyone help me on this one? Is this the right place to post this
> question?
>
> --
> Thanks in advance,
> John Scott.
|||Thanks for the reply, Mohit.
I checked two things... there are some goofy entries in the vssettings file
in My Documents.
I made changes to it, but there is no change when I open up SQL Express...
I also checked and I don't have the
HKEY_CURRENT_USER\Software\Microsoft\Microsoft SQL
Server\90\Tools\Shell\Profile\ key. I have everything up to Shell; I
have no Profile in there.
Other suggestions?
Thanks,
John Scott.
"Mohit K. Gupta" wrote:
> SQL Server 2005 stores it's configuration information in following location:
> My Documents\SQL Server Management
> Studio\Settings\CurrentSettings-2007-06-06.vssettings
> Check to make sure there is no writing problems to that.
> The path information for the config file is located at
> HKEY_CURRENT_USER\Software\Microsoft\Microsoft SQL
> Server\90\Tools\Shell\Profile\
> Thanks!
> --
> Mohit K. Gupta
> B.Sc. CS, Minor Japanese
> MCTS: SQL Server 2005
>
> "John Scott" wrote:
|||Sorry for the late reply.
The proper path in the registry is
HKEY_CURRENT_USER\Software\Microsoft\Microsoft SQL
Server\90\Tools\ShellSEM\Profile\.
I can't get the issue to replicate. Here are two options ...
- Delete the CurrentSettings-DATE.vssettings file.
This will force SSMSEE to reinitialize the settings. Do a search on your
computer for vssettings and delete anything with CurrentSettings in it.
OR
- Try Reinstalling the SSMSEE.
Mohit K. Gupta
B.Sc. CS, Minor Japanese
MCTS: SQL Server 2005
|||Ok...I did a few things.
1. I deleted my CurrentSettings file in the ssmses settings folder... when I
opened up ssmses, it popped up a message saying it couldn't find my settings
folder and that it would use the most recent settings for this session.
2. I checked that registry path... and I did find it, but didn't find anything
out of the ordinary... it had 13 values inside that key:
AutoSaveFile
AutoSaveFILETIMEHigh
AutoSaveFILETimeLow
BuildNum
DefaultSettinsDirectory
LastResetSettingsFile
SettingsCachesFILEIMEHigh
SettingsCachesFILEIMELow
Sku
TeamSettingsFile
TeamSettingsFILETIMEHigh
TeamSettingsFILETIMELow
TraackTeamSettings
for the LastResetSettingsFile the value was
'%vsspv_vs_install_directory%\Profiles\ssmsee.vssettings'
I checked out this file and there is a node called Category
name="Environment_WindowLayout";
inside it, there is another node called PropertyValue name="WindowLayout",
then after that there is a whole bunch of garbled text (looks like viewstate
data),
then the PropertyValue node closes...
I don't know if that is normal or not...
Any other suggestions?
Thanks,
John Scott.
"Mohit K. Gupta" wrote:

> Sorry for late reply.
> Sorry the proper path in registry is
> HKEY_CURRENT_USER\Software\Microsoft\Microsoft SQL
> Server\90\Tools\ShellSEM\Profile\.
> I can't get the issue to replicate. Here are two options ...
> - Delete the CurrentSettings-DATE.vsssettings file.
> This will force SSMSEE to reinitialize the settings. Do a search on your
> computer for vsssettings delete anything with CurrentSettings in it.
> OR
> - Try Reinstalling the SSMSEE.
> --
> Mohit K. Gupta
> B.Sc. CS, Minor Japanese
> MCTS: SQL Server 2005
>
|||Delete all settings files. I would just try reinstalling though.
Mohit K. Gupta
B.Sc. CS, Minor Japanese
MCTS: SQL Server 2005
"John Scott" wrote:
> Ok...I did a few things.
> 1. I deleted my CurrentSettings file in ssmses settings folder...when I
> opened up ssmses it poped up a message saying it coudn't find my settings
> folder and that it would use the most recent settings for this session
> 2. I checked that registry path...and i did find it, didn't find anything
> out of the ordinary...it had 13 values inside that key
> AutoSaveFile
> AutoSaveFILETIMEHigh
> AutoSaveFILETimeLow
> BuildNum
> DefaultSettinsDirectory
> LastResetSettingsFile
> SettingsCachesFILEIMEHigh
> SettingsCachesFILEIMELow
> Sku
> TeamSettingsFile
> TeamSettingsFILETIMEHigh
> TeamSettingsFILETIMELow
> TraackTeamSettings
>
> for the LastResetSettingsFile the value was
> '%vsspv_vs_install_directory%\Profiles\ssmsee.vssettings'
> I checked out this file and there is a node called Caegory
> name="Environment_WindowLayout"
> inside it, there is another node call "PropertyVale name="WindowLayout">
> then after that there is a whole bunch of garbled text (looks like viewstate
> data)
> then the PropertyValue node closes...
> I don't know if that is normal or not...
> Any other suggestions?
>
> --
> Thanks,
> John Scott.
>
> "Mohit K. Gupta" wrote:

My log file size is 10GB....

My SQL log file size on 22/06/2007 showed as 10GB; now it's showing as 20GB, but when I take a backup, the backup file is only 4MB. When restoring, it shows a message that there is no space, due to the 10GB log file size. Can anybody tell me the solution for this? Please, it's urgent.|||Hi,
run the following commands:

1. dbcc sqlperf(logspace) -- the output shows each log's size and space used.
2. backup log <dbname> with truncate_only
3. dbcc shrinkdatabase(<dbname>, 10)
4. dbcc shrinkfile(2) -- 2 is the file ID of the log file

after these steps are done, execute step 1 again after 10 or 20 minutes and
let me know the log size growth; see the sketch below.
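A minimal runnable version of those steps, as a sketch only (the database
name MyDB is an assumption, file ID 2 is usually the log, and WITH
TRUNCATE_ONLY applies to SQL Server 2000/2005 -- it was removed in SQL
Server 2008):

-- 1. check the size of each log and how much of it is in use
dbcc sqlperf(logspace);
-- 2. mark the inactive log space as reusable without backing it up
--    (this breaks the log backup chain, so take a full backup afterwards)
backup log MyDB with truncate_only;
-- 3. shrink the whole database, leaving 10 percent free space
dbcc shrinkdatabase(MyDB, 10);
-- 4. shrink the log file itself (run inside the database)
use MyDB;
dbcc shrinkfile(2);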
good luck
Bharat|||

Quote:

Originally Posted by bharadwaj

Hi,
run the following commands:

1. dbcc sqlperf(logspace) -- the output shows each log's size and space used.
2. backup log <dbname> with truncate_only
3. dbcc shrinkdatabase(<dbname>, 10)
4. dbcc shrinkfile(2) -- 2 is the file ID of the log file

after these steps are done, execute step 1 again after 10 or 20 minutes and
let me know the log size growth.
good luck
Bharat


=============

Hi Bharadwaj.
Thank you for your quick solution. I will try it and get back to you.

Good day.
Alagar

My log file is growing huge! Why? Help ASAP! please

Since upgrading to 2005 from 2000, in my sql event log, for 2005, I am
seeing this multiple times each day:
---
Autogrow of file 'MyDatabase_Log' in database 'MyDatabase' was cancelled by
user or timed out after __ milliseconds. Use ALTER DATABASE to set a smaller
FileGrowth value for this file or to explicitly set a new file size.
---
(file growth = 10%)
At the same time, applications are experiencing timeouts on reads/writes,
which is a major problem, probably (I'm guessing) because this is causing a
database lock. This particular DB is about 3 GB and the log file is about
30 GB. The log file was only 1 mb before I did the upgrade. I have no
idea why this is happening. I am not using replication or log shipping or
anything like that.
a) what gets stored in the log file?
b) why is mine growing out of control?
c) how do I stop the growth?
d) how do I get the log file back to a small size again?|||Do a backup of the transaction log and then shrink the file.
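For example, a minimal sketch (the database name MyDatabase and the backup
path are assumptions; the logical log file name MyDatabase_Log is taken
from the error message above):

-- back up the log so its inactive portion becomes reusable
BACKUP LOG MyDatabase TO DISK = 'E:\Backups\MyDatabase_log.trn';
-- shrink the log file down to about 512 MB
DBCC SHRINKFILE (MyDatabase_Log, 512);
-- and, as the error message suggests, replace the 10% autogrow
-- with a fixed increment
ALTER DATABASE MyDatabase MODIFY FILE
    (NAME = MyDatabase_Log, FILEGROWTH = 100MB);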
Ben Nevarez, MCDBA, OCP
Database Administrator
"D. Patrick" wrote:
> Since upgrading to 2005 from 2000, in my sql event log, for 2005, I am
> seeing this multiple times each day:
> ---
> Autogrow of file 'MyDatabase_Log' in database 'MyDatabase' was cancelled by
> user or timed out after __ milliseconds. Use ALTER DATABASE to set a smaller
> FileGrowth value for this file or to explicitly set a new file size.
> ---
> (file growth = 10%)
> At the same time, applications are experiencing timeouts on reads/writes,
> which is a major problem, probably (I'm guessing) because this is causing a
> database lock. This particular DB is about 3 GB and the log file is about
> 30 GB. The log file was only 1 mb before I did the upgrade. I have no
> idea why this is happening. I am not using replication or log shipping or
> anything like that.
> a) what gets stored in the log file?
> b) why is mine growing out of control?
> c) how do I stop the growth?
> d) how do I get the log file back to a small size again?
>
>|||Thanks for the reply, but I don't believe that solves the underlying
problem. Why is it growing? What should I look at?
And BOL says that shrinking the file may not actually shrink the physical
file. I am very confused and would appreciate any information or
step-by-step instructions. Thank you.
"Ben Nevarez" <BenNevarez@.discussions.microsoft.com> wrote in message
news:C306E3FF-132D-4E8C-A150-078DB9A2DB2D@.microsoft.com...
> Do a backup of the transaction log and then shrink the file.
> Ben Nevarez, MCDBA, OCP
> Database Administrator
>
> "D. Patrick" wrote:
>> Since upgrading to 2005 from 2000, in my sql event log, for 2005, I am
>> seeing this multiple times each day:
>> ---
>> Autogrow of file 'MyDatabase_Log' in database 'MyDatabase' was cancelled
>> by
>> user or timed out after __ milliseconds. Use ALTER DATABASE to set a
>> smaller
>> FileGrowth value for this file or to explicitly set a new file size.
>> ---
>> (file growth = 10%)
>> At the same time, applications are experiencing timeouts on reads/writes,
>> which is a major problem, probably (I'm guessing) because this is causing
>> a
>> database lock. This particular DB is about 3 GB and the log file is
>> about
>> 30 GB. The log file was only 1 mb before I did the upgrade. I have no
>> idea why this is happening. I am not using replication or log shipping
>> or
>> anything like that.
>> a) what gets stored in the log file?
>> b) why is mine growing out of control?
>> c) how do I stop the growth?
>> d) how do I get the log file back to a small size again?
>>|||A reason for the growth is that your DB is perhaps in the full
recovery model and maybe it was in the simple recovery model on the old
server.
Take a look at this; it might help you out:
http://www.karaszi.com/SQLServer/info_dont_shrink.asp
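To check or change the recovery model, a minimal sketch (the database name
MyDatabase is an assumption; the ALTER runs while the database is online,
but note that switching to SIMPLE breaks the log backup chain):

-- SQL Server 2005: list each database's recovery model
SELECT name, recovery_model_desc FROM sys.databases;
-- switch models: SIMPLE, FULL, or BULK_LOGGED
ALTER DATABASE MyDatabase SET RECOVERY SIMPLE;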
Denis the SQL Menace
http://sqlservercode.blogspot.com/|||That is true. I changed that option recently. Does that option really use
10x the space of the database (and more, it seems, by the day)? It seems
that it is useless if it uses 3 GB per day on a 3 GB database. Is there
some other setting that removes log info after each database backup, so that
it will only keep a bit of log data?
Can I change that back to simple while the production DB is running?
I have read BOL and your URL links about shrinking database files. Maybe
I'm just stupid, but I still have no idea what to type into the query window
to get back to a nice little 1 MB log file (instead of the 30 GB and growing
it is at right now). Please help. Thanks.
"SQL" <denis.gobo@.gmail.com> wrote in message
news:1147461310.310058.75290@.g10g2000cwb.googlegroups.com...
>A reason for the growth is that your DB is perhaps in Fully logged
> recovery mode and maybe it was in simple recovery mode on the old
> server
> Take a look at this it might help you out
> http://www.karaszi.com/SQLServer/info_dont_shrink.asp
>
> Denis the SQL Menace
> http://sqlservercode.blogspot.com/
>|||It is normal for the transaction log file to grow. Are you doing regular
transaction log backups?
Try that first (backup and shrink) and let us know what happens.
Ben Nevarez, MCDBA, OCP
Database Administrator
"D. Patrick" wrote:
> Thanks for the reply, but I don't believe that solves the underlying
> problem. Why is it growing? What should look at?
> And BOL says that shrinking the file may not actually shrink the physical
> file. I am very confused and would appreciate any information or
> step-by-step instructions. Thank you.
>
> "Ben Nevarez" <BenNevarez@.discussions.microsoft.com> wrote in message
> news:C306E3FF-132D-4E8C-A150-078DB9A2DB2D@.microsoft.com...
> >
> > Do a backup of the transaction log and then shrink the file.
> >
> > Ben Nevarez, MCDBA, OCP
> > Database Administrator
> >
> >
> > "D. Patrick" wrote:
> >
> >> Since upgrading to 2005 from 2000, in my sql event log, for 2005, I am
> >> seeing this multiple times each day:
> >>
> >> ---
> >> Autogrow of file 'MyDatabase_Log' in database 'MyDatabase' was cancelled
> >> by
> >> user or timed out after __ milliseconds. Use ALTER DATABASE to set a
> >> smaller
> >> FileGrowth value for this file or to explicitly set a new file size.
> >> ---
> >> (file growth = 10%)
> >>
> >> At the same time, applications are experiencing timeouts on reads/writes,
> >> which is a major problem, probably (I'm guessing) because this is causing
> >> a
> >> database lock. This particular DB is about 3 GB and the log file is
> >> about
> >> 30 GB. The log file was only 1 mb before I did the upgrade. I have no
> >> idea why this is happening. I am not using replication or log shipping
> >> or
> >> anything like that.
> >>
> >> a) what gets stored in the log file?
> >> b) why is mine growing out of control?
> >> c) how do I stop the growth?
> >> d) how do I get the log file back to a small size again?
> >>
> >>
> >>
>
>|||I have not been backing up the log. Is that the problem? When you do a log
backup, does that automatically then drop the file back to 1 MB again? A
command example that I can type in the query window would greatly help me.
"Ben Nevarez" <BenNevarez@.discussions.microsoft.com> wrote in message
news:1CAE345F-A36B-4840-B190-495705186E06@.microsoft.com...
> It is normal for the transaction log file to grow. Are you doing regular
> transaction log backups?
> Try that first (backup and shrink) and let us know what happen.
> Ben Nevarez, MCDBA, OCP
> Database Administrator
>
> "D. Patrick" wrote:
>> Thanks for the reply, but I don't believe that solves the underlying
>> problem. Why is it growing? What should look at?
>> And BOL says that shrinking the file may not actually shrink the physical
>> file. I am very confused and would appreciate any information or
>> step-by-step instructions. Thank you.
>>
>> "Ben Nevarez" <BenNevarez@.discussions.microsoft.com> wrote in message
>> news:C306E3FF-132D-4E8C-A150-078DB9A2DB2D@.microsoft.com...
>> >
>> > Do a backup of the transaction log and then shrink the file.
>> >
>> > Ben Nevarez, MCDBA, OCP
>> > Database Administrator
>> >
>> >
>> > "D. Patrick" wrote:
>> >
>> >> Since upgrading to 2005 from 2000, in my sql event log, for 2005, I am
>> >> seeing this multiple times each day:
>> >>
>> >> ---
>> >> Autogrow of file 'MyDatabase_Log' in database 'MyDatabase' was
>> >> cancelled
>> >> by
>> >> user or timed out after __ milliseconds. Use ALTER DATABASE to set a
>> >> smaller
>> >> FileGrowth value for this file or to explicitly set a new file size.
>> >> ---
>> >> (file growth = 10%)
>> >>
>> >> At the same time, applications are experiencing timeouts on
>> >> reads/writes,
>> >> which is a major problem, probably (I'm guessing) because this is
>> >> causing
>> >> a
>> >> database lock. This particular DB is about 3 GB and the log file is
>> >> about
>> >> 30 GB. The log file was only 1 mb before I did the upgrade. I have
>> >> no
>> >> idea why this is happening. I am not using replication or log
>> >> shipping
>> >> or
>> >> anything like that.
>> >>
>> >> a) what gets stored in the log file?
>> >> b) why is mine growing out of control?
>> >> c) how do I stop the growth?
>> >> d) how do I get the log file back to a small size again?
>> >>
>> >>
>> >>
>>|||backup log with TRUNCATE_ONLY will release space, however you still
need to issue a SHRINK command after that
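Put together, a sketch (assumes SQL Server 2005 and a database named
MyDatabase; WITH TRUNCATE_ONLY was removed in SQL Server 2008):

USE MyDatabase;
-- confirm the log file's logical name and file ID first
SELECT file_id, name, type_desc FROM sys.database_files;
BACKUP LOG MyDatabase WITH TRUNCATE_ONLY;
DBCC SHRINKFILE (MyDatabase_Log);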
Denis the SQL Menace
http://sqlservercode.blogspot.com/|||Two questions:
Can I do this on the production server without causing live problems?
Should I do this in a script, in the future, after all full database backups
(assuming that I don't want to roll back prior to the full backup moment)?
"SQL" <denis.gobo@.gmail.com> wrote in message
news:1147463618.444554.51680@.i40g2000cwc.googlegroups.com...
> backup log with TRUNCATE_ONLY will release space, however you still
> need to issue a SHRINK command after that
> Denis the SQL Menace
> http://sqlservercode.blogspot.com/
>|||My database is similar in size to yours (actually a little larger). I have
SQL Agent backing up the log every 15 minutes with full logging on. This
keeps the log file from growing too large. Also, in the event of a failure,
I am able to restore to the point 15 minutes (sometimes 30) before the
failure. We (OK my boss) decided this was an acceptable trade off vs. the
expense of having a true failover server.
"D. Patrick" <replywithinthegroup@.thenotreal.com> wrote in message
news:BL59g.1916$uM4.1877@.tornado.socal.rr.com...
> Two questions:
> Can I do this on the production server without causing live problems?
> Should I do this in a script, in the future, after all full database
backups
> (assuming that I don't want to roll back prior to the full backup moment)?
>
> "SQL" <denis.gobo@.gmail.com> wrote in message
> news:1147463618.444554.51680@.i40g2000cwc.googlegroups.com...
> > backup log with TRUNCATE_ONLY will release space, however you still
> > need to issue a SHRINK command after that
> >
> > Denis the SQL Menace
> > http://sqlservercode.blogspot.com/
> >
>|||Would you mind posting the script that is running to back up your log file?
"Ron Hinds" <__ron__dontspamme@.wedontlikespam_garageiq.com> wrote in message
news:uLFGIAgdGHA.4276@.TK2MSFTNGP03.phx.gbl...
> My database is similar in size to yours (actually a little larger). I have
> SQL Agent backing up the log every 15 minutes with full logging on. This
> keeps the log file from growing too large. Also, in the event of a
> failure,
> I am able to restore to the point 15 minutes (sometimes 30) before the
> failure. We (OK my boss) decided this was an acceptable trade off vs. the
> expense of having a true failover server.
> "D. Patrick" <replywithinthegroup@.thenotreal.com> wrote in message
> news:BL59g.1916$uM4.1877@.tornado.socal.rr.com...
>> Two questions:
>> Can I do this on the production server without causing live problems?
>> Should I do this in a script, in the future, after all full database
> backups
>> (assuming that I don't want to roll back prior to the full backup
>> moment)?
>>
>> "SQL" <denis.gobo@.gmail.com> wrote in message
>> news:1147463618.444554.51680@.i40g2000cwc.googlegroups.com...
>> > backup log with TRUNCATE_ONLY will release space, however you still
>> > need to issue a SHRINK command after that
>> >
>> > Denis the SQL Menace
>> > http://sqlservercode.blogspot.com/
>> >
>>
>|||It won't make much sense. I used Enterprise Manager and created a DB
Maintenance Plan to do the job. So the script looks like:
EXECUTE master.dbo.xp_sqlmaint N'-PlanID
8797A992-C063-43C1-AAC4-5C0302531C89 -WriteHistory -VrfyBackup -BkUpMedia
DISK -BkUpLog "C:\USB_Drive\SQL_Backup" -DelBkUps 4DAYS -BkExt "TRN"'
Using a maintenance plan is pretty easy for something like this that isn't
too complicated.
"D. Patrick" <replywithinthegroup@.thenotreal.com> wrote in message
news:6369g.1950$uM4.783@.tornado.socal.rr.com...
> Would you mind posting the script that is running to back up your log
file?
>
> "Ron Hinds" <__ron__dontspamme@.wedontlikespam_garageiq.com> wrote in
message
> news:uLFGIAgdGHA.4276@.TK2MSFTNGP03.phx.gbl...
> > My database is similar in size to yours (actually a little larger). I
have
> > SQL Agent backing up the log every 15 minutes with full logging on. This
> > keeps the log file from growing too large. Also, in the event of a
> > failure,
> > I am able to restore to the point 15 minutes (sometimes 30) before the
> > failure. We (OK my boss) decided this was an acceptable trade off vs.
the
> > expense of having a true failover server.
> >
> > "D. Patrick" <replywithinthegroup@.thenotreal.com> wrote in message
> > news:BL59g.1916$uM4.1877@.tornado.socal.rr.com...
> >> Two questions:
> >>
> >> Can I do this on the production server without causing live problems?
> >>
> >> Should I do this in a script, in the future, after all full database
> > backups
> >> (assuming that I don't want to roll back prior to the full backup
> >> moment)?
> >>
> >>
> >> "SQL" <denis.gobo@.gmail.com> wrote in message
> >> news:1147463618.444554.51680@.i40g2000cwc.googlegroups.com...
> >> > backup log with TRUNCATE_ONLY will release space, however you still
> >> > need to issue a SHRINK command after that
> >> >
> >> > Denis the SQL Menace
> >> > http://sqlservercode.blogspot.com/
> >> >
> >>
> >>
> >
> >
>|||thank you.
"Ron Hinds" <__ron__dontspamme@.wedontlikespam_garageiq.com> wrote in message
news:eRHv8VgdGHA.564@.TK2MSFTNGP02.phx.gbl...
> It won't make much sense. I used Enterprise Manager and created a DB
> Maintenance Plan to do the job. So the script looks like:
> EXECUTE master.dbo.xp_sqlmaint N'-PlanID
> 8797A992-C063-43C1-AAC4-5C0302531C89 -WriteHistory -VrfyBackup -BkUpMedia
> DISK -BkUpLog "C:\USB_Drive\SQL_Backup" -DelBkUps 4DAYS -BkExt "TRN"'
> Using a maintenance plan is pretty easy for something like this that isn't
> too complicated.
> "D. Patrick" <replywithinthegroup@.thenotreal.com> wrote in message
> news:6369g.1950$uM4.783@.tornado.socal.rr.com...
>> Would you mind posting the script that is running to back up your log
> file?
>>
>> "Ron Hinds" <__ron__dontspamme@.wedontlikespam_garageiq.com> wrote in
> message
>> news:uLFGIAgdGHA.4276@.TK2MSFTNGP03.phx.gbl...
>> > My database is similar in size to yours (actually a little larger). I
> have
>> > SQL Agent backing up the log every 15 minutes with full logging on.
>> > This
>> > keeps the log file from growing too large. Also, in the event of a
>> > failure,
>> > I am able to restore to the point 15 minutes (sometimes 30) before the
>> > failure. We (OK my boss) decided this was an acceptable trade off vs.
> the
>> > expense of having a true failover server.
>> >
>> > "D. Patrick" <replywithinthegroup@.thenotreal.com> wrote in message
>> > news:BL59g.1916$uM4.1877@.tornado.socal.rr.com...
>> >> Two questions:
>> >>
>> >> Can I do this on the production server without causing live problems?
>> >>
>> >> Should I do this in a script, in the future, after all full database
>> > backups
>> >> (assuming that I don't want to roll back prior to the full backup
>> >> moment)?
>> >>
>> >>
>> >> "SQL" <denis.gobo@.gmail.com> wrote in message
>> >> news:1147463618.444554.51680@.i40g2000cwc.googlegroups.com...
>> >> > backup log with TRUNCATE_ONLY will release space, however you still
>> >> > need to issue a SHRINK command after that
>> >> >
>> >> > Denis the SQL Menace
>> >> > http://sqlservercode.blogspot.com/
>> >> >
>> >>
>> >>
>> >
>> >
>>
>|||Another way to do it is to "detach" the database and then "attach" it back.
This can be done through Enterprise Manager. Usually one does that as part
of moving a database, but if you just reattach from the same place, it works
fine and shrinks the log file. We went from 10 GB to under 1 MB. I had
experimented with shrink and truncate before and never could get it quite
right, but the detach/attach worked fine. Alternatively, you can use Query
Analyzer and run sp_detach_db and sp_attach_db.|||"D. Patrick" <replywithinthegroup@.thenotreal.com> wrote in message
news:NB59g.1899$uM4.1049@.tornado.socal.rr.com...
> I have not been backing up the log. Is that the problem?
Yes, that is exactly the problem.
> When you do a log
> backup, does that automatically then drop the file back to 1 MB again?
Not quite; it frees the space used by completed transactions for reuse.
Transactions that aren't complete stay in the log until a later backup.
> A
> command example that I can type in the query window would greatly help me.
backup log <dbname> to disk='f:\log_backups\<dbname>_tlog_200605121724.trn'
But your best bet is to schedule a daily or more frequent backup.
(We do ours every 20 minutes, for example, so that we're never more than 20
minutes "out of date" in the event of disaster recovery.)
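To give each backup file a unique timestamped name like the one above, a
sketch (the folder and the database name MyDatabase are assumptions):

DECLARE @name nvarchar(260);
SET @name = N'f:\log_backups\MyDatabase_tlog_'
    + CONVERT(nvarchar(8), GETDATE(), 112)                   -- yyyymmdd
    + REPLACE(CONVERT(nvarchar(5), GETDATE(), 108), ':', '') -- hhmm
    + N'.trn';
BACKUP LOG MyDatabase TO DISK = @name;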
>
> "Ben Nevarez" <BenNevarez@.discussions.microsoft.com> wrote in message
> news:1CAE345F-A36B-4840-B190-495705186E06@.microsoft.com...
> >
> > It is normal for the transaction log file to grow. Are you doing regular
> > transaction log backups?
> >
> > Try that first (backup and shrink) and let us know what happen.
> >
> > Ben Nevarez, MCDBA, OCP
> > Database Administrator
> >
> >
> > "D. Patrick" wrote:
> >
> >> Thanks for the reply, but I don't believe that solves the underlying
> >> problem. Why is it growing? What should look at?
> >>
> >> And BOL says that shrinking the file may not actually shrink the
physical
> >> file. I am very confused and would appreciate any information or
> >> step-by-step instructions. Thank you.
> >>
> >>
> >> "Ben Nevarez" <BenNevarez@.discussions.microsoft.com> wrote in message
> >> news:C306E3FF-132D-4E8C-A150-078DB9A2DB2D@.microsoft.com...
> >> >
> >> > Do a backup of the transaction log and then shrink the file.
> >> >
> >> > Ben Nevarez, MCDBA, OCP
> >> > Database Administrator
> >> >
> >> >
> >> > "D. Patrick" wrote:
> >> >
> >> >> Since upgrading to 2005 from 2000, in my sql event log, for 2005, I
am
> >> >> seeing this multiple times each day:
> >> >>
> >> >> ---
> >> >> Autogrow of file 'MyDatabase_Log' in database 'MyDatabase' was
> >> >> cancelled
> >> >> by
> >> >> user or timed out after __ milliseconds. Use ALTER DATABASE to set a
> >> >> smaller
> >> >> FileGrowth value for this file or to explicitly set a new file size.
> >> >> ---
> >> >> (file growth = 10%)
> >> >>
> >> >> At the same time, applications are experiencing timeouts on
> >> >> reads/writes,
> >> >> which is a major problem, probably (I'm guessing) because this is
> >> >> causing
> >> >> a
> >> >> database lock. This particular DB is about 3 GB and the log file is
> >> >> about
> >> >> 30 GB. The log file was only 1 mb before I did the upgrade. I
have
> >> >> no
> >> >> idea why this is happening. I am not using replication or log
> >> >> shipping
> >> >> or
> >> >> anything like that.
> >> >>
> >> >> a) what gets stored in the log file?
> >> >> b) why is mine growing out of control?
> >> >> c) how do I stop the growth?
> >> >> d) how do I get the log file back to a small size again?
> >> >>
> >> >>
> >> >>
> >>
> >>
> >>
>|||"D Shirley" <D Shirley@.discussions.microsoft.com> wrote in message
news:E07A48F1-468A-407E-8FCD-885A72C51297@.microsoft.com...
> Another way to do it is to "detach" the database and then "attach" it
back.
> This can be done throught Enterprise Manager. Usually one does that as
part
> of moving a database, but if you just reattach from the same place, it
works
> fine and shrinks the log file.
Umm, this should NOT work that way unless you're deleting the log file.
Don't do that. It may seem to work, but you could end up with a ruined database.
> We went from 10 GB to under 1 MB. I had
> experimented with shrink and truncate before and never could get it quite
> right, but the detach/attach worked fine. Alternatively, you can use
Query
> Analyzer and run sp_detach_db and sp_attach_db.
>
Also, do NOT shrink and truncate unless you really don't care about
disaster recovery (or have very simple needs).
Shrinking can lead to a fragmented file on the disk, and truncating means
you've limited what you can recover.
