As Paul Randal says, you should not shrink your files.
However, there are situations where I do need to shrink files. For example:
In the live database, documents and images are stored inside the data files, but most of them are not needed in the dev and UAT environments. Once that data has been removed, I shrink the files, because a 200 GB database is much easier to handle than one of over a terabyte.
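For reference, the shrink itself is just a DBCC SHRINKFILE call; here is a minimal sketch, assuming a hypothetical dev database, a hypothetical logical file name, and a 200 GB target:

USE MyDevDatabase;  -- hypothetical dev/UAT database
GO
--shrink the data file to roughly 200 GB (target size is given in MB)
--the logical file name and the target size are illustrative only
DBCC SHRINKFILE (N'MyDevDatabase_Data', 204800);
GO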
To check the progress of the restore, the backup, or the shrink operation, I use this script:
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
SET NOCOUNT ON;

--check the progress of a backup, restore or shrink operation
--marcello miorelli v.20240901
SELECT
     [Server Name]          = @@SERVERNAME
    ,s.host_name
    ,TheUser                = s.original_login_name
    ,r.session_id
    ,r.command
    ,[Percent Complete]     = CONVERT(NUMERIC(6,2), r.percent_complete)
    ,[Start Time]           = CONVERT(VARCHAR(30), r.start_time, 113)
    ,[ETA Completion Time]  = CONVERT(VARCHAR(30), DATEADD(ms, r.estimated_completion_time, GETDATE()), 113)
    ,sdest.Query
FROM sys.dm_exec_requests r WITH (NOLOCK)
INNER JOIN sys.dm_exec_sessions s WITH (NOLOCK)
    ON r.session_id = s.session_id
OUTER APPLY (
    --return the text of the running batch, XML-encoded so special characters survive
    SELECT COALESCE((
        SELECT [text] AS [processing-instruction(_)]
        FROM sys.dm_exec_sql_text(r.sql_handle)
        FOR XML PATH(''), TYPE
        ), '') AS Query
    FROM sys.dm_exec_sql_text(r.sql_handle)
) sdest
WHERE 1 = 1
  --DbccFilesCompact and DbccSpaceReclaim are the commands reported for DBCC SHRINKFILE / SHRINKDATABASE
  AND r.command IN ('RESTORE DATABASE', 'BACKUP DATABASE', 'DbccFilesCompact', 'DbccSpaceReclaim');
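As a usage sketch (the database name and backup path below are hypothetical), start a long-running operation in one session and run the script above from a second session to watch it progress:

--session 1: a long-running operation (name and path are illustrative)
BACKUP DATABASE MyBigDatabase
TO DISK = N'E:\Backups\MyBigDatabase.bak'
WITH COMPRESSION, STATS = 10;

--session 2: run the monitoring query above;
--the backup appears with command = 'BACKUP DATABASE' and a growing [Percent Complete]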
The query also returns the text of the statement that started the backup, restore, or shrink, which makes it a helpful tool in any DBA's toolbox.
