Archive
Useful Unix commands
- ls -lrt
- tail -10 filename : show the last 10 lines of a file
- Find the biggest file under the current mount point:
find . -type f -exec ls -l {} \; | awk '{print $5, $NF}' | sort -n | tail -1
- Create directories recursively:
mkdir -p /dir1/dir2/dir3
- Copy a file and make a compressed copy at the same time (the original is kept):
compress -c /file1 > /dir/file2.Z
- Kill LOCAL sessions before shutdown:
kill -9 `ps -ef | grep LOCAL | awk '{print $2}'`
- Remove files by date:
rm `ls -l | grep 'Jul 18' | awk '{print $NF}'`
Useful SQL Server commands
Coming…
Useful Denodo Developer Commands/Info
In this area I will be posting useful Denodo commands and tricks. Below are a few:
- context ('querytimeout' = )
Example:
context ('querytimeout' = '10800000')
Use the above syntax at the end of a SELECT statement to avoid the default timeout, which usually kicks in after about 15 minutes (see the sketch after this list).
- DESC VQL VIEW
Use it to get field names and their data types.
- MAP (,'$.Parent1.ChildElementName')
Useful in a SELECT or WHERE clause to extract values from a JSON column.
- Use CTEs or TEMP tables for intermediate data processing.
- The escape character in Denodo is $. Useful when searching data with REGEX-based functions.
- Denodo Documentation (select the appropriate version): http://help.denodo.com/platform-docs/5.5/
- Support.Denodo.com: for Denodo customers to create a case/ticket for questions or issues.
- DBeaver: I like this client tool because you can open multiple query windows and run multiple commands in each window at the same time.
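To put a couple of these tips together, here is a minimal VQL sketch; the view name dv_orders and its columns are hypothetical placeholders, not real objects.
-- Hypothetical view (dv_orders), for illustration only.
-- Get the field names and data types of the view:
DESC VQL VIEW dv_orders;

-- Run a long query with the timeout raised to 3 hours (10800000 ms):
SELECT order_id, order_date, amount
FROM dv_orders
CONTEXT ('querytimeout' = '10800000');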
Avoid time-out for Denodo ADO NET Source in SSIS
In SSIS, I was executing a SQL command in the ADO NET Source of one of my Data Flows and was getting a time-out error after approximately 15 minutes. I added the below clause at the end of the SELECT statement.
context ('querytimeout' = '9000000')
Here the time is in milliseconds (in our case, '9000000' is approximately 2.5 hours).
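For reference, the SQL command text in the ADO NET Source ended up looking roughly like this; the view and column names below are hypothetical placeholders.
-- Hypothetical view/column names; the CONTEXT clause goes at the very end of the statement.
SELECT customer_id, order_id, order_date, amount
FROM dv_customer_orders
CONTEXT ('querytimeout' = '9000000')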
For more info, read "CONTEXT CLAUSE" in the Denodo documentation, in the "Advanced VQL Guide".
Performance Tuning in SSIS for Denodo ADO NET Source
Assumption: you have the Denodo ODBC driver installed on the server. (I used the "DenodoODBC Unicode" driver, version 9.03.04.00, when I created the User DSN.)
The steps below helped me import two million rows of data (6 columns, mostly int) in less than 4 minutes. Earlier it used to take more than one and a half hours.
The trick here is to change the User DSN settings. :):):)
- Click "Start", type c:\Windows\SysWOW64\odbcad32.exe in "Search programs and files", and hit Enter to open the "ODBC Data Source Administrator".
- In the "ODBC Data Source Administrator", on the "User DSN" tab, select the appropriate data source from the "User Data Sources" list and click "Configure".
- The above step opens the "ODBC Driver Setup" window; click the "Datasource" button.
- Make the below changes in the "Advanced Options" window:
- On Page 1:
- In "Data Type Options", un-select "Text as LongVarchar"
- In "Miscellaneous", change "Chunk Size" from 100 to 75000
- On Page 3:
- In "The Use of LIBPQ library", change the selection from "unspecified" to "yes", then click Apply and click OK.
I hope this helps.
Logging by Triggers in SQL Server
We could alternatively use the SQL Server CDC (Change Data Capture) feature, but here I am providing one more way of doing it (a custom solution).
DB objects (two tables and two triggers):
- TargetTableName: Table whose data we want to log
- LogTableName: Table where data will be logged. This table's structure will be the same as TargetTableName, and additionally it will have four extra columns, namely
- [log_action_type]
- [log_action_date]
- [log_action_user]
- [log_action_hostname]
- tl_TargetTableName: Trigger to log DML changes (Update and Delete only)
- tu_LogTableName: Trigger to prevent Delete or Update operations on the logging table
USE dbname
GO

ALTER TABLE LogTableName ADD [log_action_type] [char](1) NOT NULL;
ALTER TABLE LogTableName ADD [log_action_date] [datetime] NOT NULL DEFAULT getdate();
ALTER TABLE LogTableName ADD [log_action_user] [varchar](50) NULL;
ALTER TABLE LogTableName ADD [log_action_hostname] [varchar](50) DEFAULT host_name();
GO

IF OBJECT_ID ('dbo.tu_LogTableName','TR') IS NOT NULL
    DROP TRIGGER dbo.tu_LogTableName;
GO
-- =============================================
-- Author:      Umesh Patil
-- Create date: 2014/08/26
-- Description: Restrict Delete or Update operations on the logging table
-- =============================================
CREATE TRIGGER dbo.tu_LogTableName
ON dbo.LogTableName
FOR DELETE, UPDATE
AS
BEGIN
    RAISERROR ('Cannot perform Delete or Update operations on LogTableName table; either disable or delete the trigger', 16, 1);
    ROLLBACK TRANSACTION;
    RETURN;
END
GO

USE dbname
GO

IF OBJECT_ID ('dbo.tl_TargetTableName','TR') IS NOT NULL
    DROP TRIGGER dbo.tl_TargetTableName;
GO
-- =============================================
-- Author:      Umesh Patil
-- Create date: 2014/08/26
-- Description: Log DML changes (Update and Delete only)
-- =============================================
CREATE TRIGGER dbo.tl_TargetTableName
ON dbo.TargetTableName
FOR DELETE, UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @ActionType char(1);

    IF EXISTS (SELECT 1 FROM deleted)
    BEGIN
        IF EXISTS (SELECT 1 FROM inserted)
            SET @ActionType = 'U';   -- update: rows appear in both inserted and deleted
        ELSE
            SET @ActionType = 'D';   -- delete: rows appear only in deleted
    END

    IF @ActionType = 'U' OR @ActionType = 'D'
        INSERT INTO [dbo].[LogTableName]   -- log into the logging table, not the target table
            ([PrimaryKey_Column]           -- include all columns
            ,[log_action_type]
            ,[log_action_date]
            ,[log_action_user]
            ,[log_action_hostname])
        SELECT
             [PrimaryKey_Column]           -- include all columns
            ,@ActionType   AS [log_action_type]
            ,getdate()     AS [log_action_date]
            ,suser_sname() AS [log_action_user]
            ,host_name()   AS [log_action_hostname]
        FROM deleted;
END
GO
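As a quick sanity check, here is a minimal sketch of how the triggers behave once in place; the column name SomeColumn and the key values are hypothetical placeholders.
-- Hypothetical column (SomeColumn) and key values, for illustration only.
UPDATE dbo.TargetTableName SET SomeColumn = 'new value' WHERE [PrimaryKey_Column] = 1;
DELETE FROM dbo.TargetTableName WHERE [PrimaryKey_Column] = 2;

-- Each statement above should add rows to the log, tagged 'U' or 'D':
SELECT [PrimaryKey_Column], [log_action_type], [log_action_date], [log_action_user], [log_action_hostname]
FROM dbo.LogTableName
ORDER BY [log_action_date] DESC;

-- Any direct UPDATE or DELETE against dbo.LogTableName should fail with the error raised by tu_LogTableName.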
Find Non-ASCII character in SQL Server
Here is a simple query to find such characters.
SELECT
     NOTES
    ,patindex('%[' + CHAR(127) + '-' + CHAR(255) + ']%' COLLATE Latin1_General_BIN, NOTES) AS [Position]
    ,substring(NOTES, patindex('%[' + CHAR(127) + '-' + CHAR(255) + ']%' COLLATE Latin1_General_BIN, NOTES), 1) AS [InvalidCharacter]
    ,ascii(substring(NOTES, patindex('%[' + CHAR(127) + '-' + CHAR(255) + ']%' COLLATE Latin1_General_BIN, NOTES), 1)) AS [ASCIICode]
FROM tablename (NOLOCK)
WHERE NOTES LIKE '%[' + CHAR(127) + '-' + CHAR(255) + ']%' COLLATE Latin1_General_100_BIN2
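To see the pattern in action without touching a real table, here is a minimal sketch against a literal string (CHAR(233) is 'é' in the Latin1 code page):
SELECT
     patindex('%[' + CHAR(127) + '-' + CHAR(255) + ']%' COLLATE Latin1_General_BIN,
              'Caf' + CHAR(233) + ' latte') AS [Position]   -- returns 4
    ,ascii(CHAR(233)) AS [ASCIICode];                        -- returns 233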
Hope it helps.
Regards,
Umesh