You can also find this article in German here.
So maybe you’ve noticed our blog post on deriving metadata from SQL Server into BimlStudio, but you’re using BimlExpress and therefore don’t have access to the feature described there? While it’s true that BimlExpress doesn’t support the metadata features of the Biml language, there are similar ways of achieving a flexible metadata model in Biml.
This post shows you how to build a model in SQL Server, import it into Biml (including derived relationships etc.) and use it in a very flexible way.
To get started, we first need to set up a sample model in SQL Server. You can either download the script from https://www.solisyon.de/files/SetupDerivedMetadata.sql or scroll to the very end of this post. Although your individual model can and will differ from this one, we suggest you follow along using our example and start tweaking it afterwards!
Pretty much all of our “magic” happens within a file called INC_MetaData.biml, which will be included by some of our actual Biml files.
In the first part, it reads the “meta” schema from a given database using ImportDB and transforms it into a DataSet, including the relationships, which will also be called “Meta”.
This allows us to navigate through our metadata model, for example by calling Meta.Tables("Connections") to access the Connections table.
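Under the hood this is just an ADO.NET DataSet with DataRelations. As a rough illustration of the navigation idea only (a hypothetical Python sketch with made-up sample rows, not the Biml code itself), you can picture the model like this:

```python
# Illustrative sketch only: the real model is an ADO.NET DataSet with
# DataRelations; plain dicts and lists stand in for tables and rows here,
# and the sample values are hypothetical.
meta = {
    "Connections": [{"Name": "AW", "Type": "Source"}],
    "Tables": [
        {"Connection": "AW", "SchemaName": "Person", "TableName": "Person"},
        {"Connection": "AW", "SchemaName": "Person", "TableName": "PersonPhone"},
    ],
}

def get_child_rows(parent_row, child_table, fk_col, pk_col):
    # Mimics DataRow.GetChildRows: follow the relation from the parent's
    # primary key value to the matching foreign key column in the child table.
    return [r for r in meta[child_table] if r[fk_col] == parent_row[pk_col]]
```

Calling `get_child_rows(meta["Connections"][0], "Tables", "Connection", "Name")` then plays the same role as `conn.GetChildRows("Connections_Tables")` does in the actual templates later in this post.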
In addition to the logic that imports the dataset, there are two functions included:
1. GetChildList
This function accepts a DataRow array as well as a column name. It returns that column of the entire array as a list. In case the array is empty, it adds a dummy value instead. Using this function, we can pass contents from our metadata as a list to methods like GetDatabaseSchema.
2. GetAnnotations
This function builds an annotation tag for each column in a DataRow. This basically exposes the whole metadata model from SQL to Biml. If we were, for example, to add a new column to our table meta.Tables (like a WHERE string), the property would immediately be available within our Biml code without making any changes to it.
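For readers who don’t speak VB, here is a rough Python analogue of both helpers (plain dicts stand in for DataRow objects; this is a sketch of the behavior, not the actual implementation):

```python
def get_child_list(rows, col):
    # Return one column of a row set as a list; if the set is empty, add a
    # dummy value so downstream consumers never receive an empty list.
    lst = [row[col] for row in rows]
    if not lst:
        lst.append("NONEMPTYSPACER")
    return lst

def get_annotations(row):
    # Build one Biml <Annotation> tag per column, exposing every metadata
    # column to the generated Biml without hard-coding property names.
    return "".join(
        f'<Annotation AnnotationType="Tag" Tag="{col}">{val}</Annotation>\n'
        for col, val in row.items()
    )
```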
<#@ template language="VB" optionexplicit="false" #>
<#@ import namespace="System.Data" #>
<#
connMetadata = "Data Source=.;Initial Catalog=BimlDemo_DerivedMetadata;Provider=SQLNCLI11;Integrated Security=SSPI;"
Dim MetaConnection = SchemaManager.CreateConnectionNode("Meta", connMetadata)
Dim MetaSchema = MetaConnection.ImportDB("meta", nothing, nothing)
Dim Meta as new DataSet
for each tbl in MetaSchema.TableNodes
    DT = ExternalDataAccess.GetDataTable(connMetadata, "select * from " & tbl.SchemaQualifiedName)
    DT.TableName = tbl.Name
    Meta.Tables.Add(DT)
next
for each tbl in MetaSchema.TableNodes
    for each fk in tbl.Columns.OfType(Of AstTableColumnTableReferenceNode)
        PK = MetaSchema.TableNodes.Where(function(c) c.Name = fk.ForeignTable.Name).First.Columns.Where(function(c) c.IsUsedInPrimaryKey = true).First.Name
        Meta.Relations.Add(fk.ForeignTable.Name & "_" & tbl.Name, _
                           Meta.Tables(fk.ForeignTable.Name).Columns(PK), _
                           Meta.Tables(tbl.Name).Columns(fk.Name))
    next
next
#>
<#+
Public Function GetChildList(RS as DataRow(), Col as String) as List(of String)
    Dim LST as new List(of String)
    For each rec in RS
        LST.Add(rec(Col))
    next
    If LST.Count = 0 Then LST.Add("NONEMPTYSPACER")
    Return LST
End Function

Function GetAnnotations(tbl as DataRow) as String
    Dim annostring as String
    for each col in tbl.Table.Columns
        annostring = annostring + "<Annotation AnnotationType=""Tag"" Tag=""" + col.Caption + """>" + tbl(col) + "</Annotation>" + vbcrlf
    next
    return annostring
End Function
#>
As mentioned before, all the actual metadata automation work happens in INC_MetaData.biml. To make something useful out of that, we still need to build some Biml around it.
Pretty much all Biml projects start with some kind of environment that defines our connections, databases and schemas. As we have all of that sitting in our sample metadata, we can use our new model and simply loop through the respective tables.
All the connections will also get an annotation for every column in the DataTable:
<#@ template language="VB" optionexplicit="false" tier="10" #>
<#@ include file="INC_MetaData.biml" #>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Connections>
        <# For Each conn In Meta.Tables("Connections").Rows #>
        <OleDbConnection Name="<#= conn("Name") #>" ConnectionString="<#= conn("ConnectionString") #>">
            <Annotations>
                <#= GetAnnotations(conn) #>
            </Annotations>
        </OleDbConnection>
        <# next #>
    </Connections>
    <Databases>
        <# For Each db In Meta.Tables("Databases").Rows #>
        <Database Name="<#= db("DB") #>" ConnectionName="<#= db("Connection") #>"></Database>
        <# next #>
    </Databases>
    <Schemas>
        <# For Each schem In Meta.Tables("Schemas").Rows #>
        <Schema Name="<#= schem("Schema") #>" DatabaseName="<#= schem("DB") #>"></Schema>
        <# next #>
    </Schemas>
</Biml>
Now that we have our connections set up, we can loop through them, get the associated tables for each connection (since we’ve set up a foreign key relationship in SQL Server, we can use the GetChildRows function for that), add them to a list and then run GetDatabaseSchema on the source connection with that list as a filter criterion.
For each table, we will query our meta dataset to get all the additional metadata (stored in the Metatbl object).
Depending on whether there are any entries in the Columns table connected to a table, we will either filter the columns based on that list or return all the columns.
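That filter decision can be summarized as follows (a hedged Python sketch of the logic with hypothetical names; the real template does this inline with a LINQ expression):

```python
def filter_columns(source_columns, metadata_columns):
    # No entries in meta.Columns for this table: take every source column.
    if not metadata_columns:
        return list(source_columns)
    # Otherwise keep only the listed columns, compared case-insensitively,
    # mirroring the ToUpper comparison in the Biml template.
    wanted = {c.upper() for c in metadata_columns}
    return [c for c in source_columns if c.upper() in wanted]
```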
Last but not least, we will, again, add all the values from that datatable as annotations:
<#@ template language="VB" optionexplicit="false" tier="20" #>
<#@ include file="INC_MetaData.biml" #>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Tables>
        <#
        For Each conn In Meta.Tables("Connections").Select("Type = 'Source'")
            Dim SRCConn as AstDbConnectionNode = RootNode.Connections(conn("Name"))
            tbls = GetChildList(conn.GetChildRows("Connections_Tables"), "TableName")
            Dim importResult = SRCConn.GetDatabaseSchema(nothing, tbls, nothing)
            for each tbl in importResult.TableNodes
                Metatbl = Meta.Tables("Tables").Select("Connection = '" & conn("Name") & "' and SchemaName = '" & tbl.Schema.Name & "' and TableName = '" & tbl.Name & "'").First
        #>
        <Table Name="<#= Metatbl("ID") #>" SchemaName="<#= Metatbl("TargetSchema") #>">
            <Columns>
                <# if Metatbl.GetChildRows("Tables_Columns").Length = 0 then #>
                <#= tbl.Columns.GetBiml() #>
                <# else #>
                <#= tbl.Columns.Where(Function(c) GetChildList(Metatbl.GetChildRows("Tables_Columns"), "ColumnName").ConvertAll(Function(u) u.ToUpper()).IndexOf(c.Name.ToUpper) <> -1).GetBiml() #>
                <# end if #>
            </Columns>
            <Annotations>
                <#= GetAnnotations(Metatbl) #>
            </Annotations>
        </Table>
        <#
            next
        next
        #>
    </Tables>
</Biml>
Using these three files, we end up with a RootNode that consists of connections, databases, schemas and tables, so from here on it is just Biml as usual – we can simply loop through the RootNode to create the required schemas and tables in our target database:
<#@ template language="VB" optionexplicit="false" tier="30" #>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Packages>
        <Package Name="01_Create Staging" ConstraintMode="Linear" ProtectionLevel="EncryptSensitiveWithUserKey">
            <Tasks>
                <# for each Schem in RootNode.Schemas #>
                <ExecuteSQL Name="Create Schema <#= Schem.Connection.Name #>_<#= Schem.Name #>" ConnectionName="<#= RootNode.Connections.Where(function(c) c.GetTag("Type") = "Target").First.Name #>">
                    <DirectInput>
                        <#= Schem.GetDropAndCreateDdl() #>
                        IF NOT EXISTS (SELECT schema_name FROM information_schema.schemata WHERE schema_name = '<#= Schem.Name #>')
                        BEGIN
                            EXEC sp_executesql N'CREATE SCHEMA [<#= Schem.Name #>]'
                        END
                    </DirectInput>
                </ExecuteSQL>
                <# next #>
                <# for each tbl in RootNode.Tables #>
                <ExecuteSQL Name="Create Table <#= tbl.Name #>" ConnectionName="<#= RootNode.Connections.Where(function(c) c.GetTag("Type") = "Target").First.Name #>">
                    <DirectInput>
                        <#= tbl.GetDropAndCreateDdl() #>
                    </DirectInput>
                </ExecuteSQL>
                <# next #>
            </Tasks>
        </Package>
    </Packages>
</Biml>
And once we’ve created our target environment, we can also create a package to actually populate those tables:
<#@ template tier="40" optionexplicit="false" language="VB" #>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Packages>
        <Package Name="02_Populate Tables" ConstraintMode="Parallel" ProtectionLevel="EncryptSensitiveWithUserKey">
            <Tasks>
                <# for each tbl in RootNode.Tables #>
                <Container Name="Transfer <#= tbl.Schema.Name #>_<#= tbl.Name #>" ConstraintMode="Linear">
                    <Tasks>
                        <ExecuteSQL Name="Truncate" ConnectionName="<#= RootNode.Connections.Where(function(c) c.GetTag("Type") = "Target").First.Name #>">
                            <DirectInput>truncate table <#= tbl.ScopedName #></DirectInput>
                        </ExecuteSQL>
                        <Dataflow Name="Copy <#= tbl.Schema.Name #>_<#= tbl.Name #>">
                            <Transformations>
                                <OleDbSource Name="Get Rows" ConnectionName="<#= tbl.GetTag("Connection") #>">
                                    <DirectInput>SELECT <#= tbl.GetColumnList() #> FROM [<#= tbl.GetTag("SchemaName") #>].[<#= tbl.GetTag("TableName") #>]</DirectInput>
                                </OleDbSource>
                                <OleDbDestination Name="Set Rows" ConnectionName="<#= RootNode.Connections.Where(function(c) c.GetTag("Type") = "Target").First.Name #>">
                                    <TableOutput TableName="<#= tbl.ScopedName #>" />
                                </OleDbDestination>
                            </Transformations>
                        </Dataflow>
                    </Tasks>
                </Container>
                <# next #>
            </Tasks>
        </Package>
    </Packages>
</Biml>
The next logical steps would be to enhance our model to support data sources other than OLE DB, or load patterns other than a full TRUNCATE and LOAD. Since our import routine is rather universal, that should require very few changes to our core code.
Any questions or comments? We’d love to hear from you!
Further information on Biml in English can be found on our English Biml page.
Happy Biml’ing!
PS: As promised, here’s the T-SQL to set up your environment again, including the sample data:
USE master
GO
CREATE DATABASE BimlDemo_DerivedMetadata
GO
USE BimlDemo_DerivedMetadata
GO
CREATE SCHEMA meta
GO

CREATE TABLE [meta].[Connections](
    [Name] [nvarchar](50) NOT NULL,
    [ConnectionString] [nvarchar](500) NULL,
    [Type] [nvarchar](50) NULL,
    CONSTRAINT [PKC_Connection] PRIMARY KEY CLUSTERED ([Name] ASC)
        WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

CREATE TABLE [meta].[Databases](
    [Connection] [nvarchar](50) NOT NULL,
    [DB] [nvarchar](50) NOT NULL,
    CONSTRAINT [PKC_Database] PRIMARY KEY CLUSTERED ([DB] ASC)
        WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
ALTER TABLE [meta].[Databases] WITH CHECK ADD CONSTRAINT [FK_Databases_REF_connection] FOREIGN KEY([Connection]) REFERENCES [meta].[Connections] ([Name]) ON DELETE CASCADE
GO
ALTER TABLE [meta].[Databases] CHECK CONSTRAINT [FK_Databases_REF_connection]
GO

CREATE TABLE [meta].[Schemas](
    [ID] AS (([DB]+'.')+[Schema]) PERSISTED NOT NULL,
    [DB] [nvarchar](50) NOT NULL,
    [Schema] [nvarchar](50) NOT NULL,
    CONSTRAINT [PKC_Schema] PRIMARY KEY CLUSTERED ([ID] ASC)
        WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
ALTER TABLE [meta].[Schemas] WITH CHECK ADD CONSTRAINT [FK_schema_REF_Database] FOREIGN KEY([DB]) REFERENCES [meta].[Databases] ([DB]) ON DELETE CASCADE
GO

CREATE TABLE [meta].[Tables](
    [ID] AS ((([Connection]+'_')+([SchemaName]+'_'))+[Tablename]) PERSISTED NOT NULL,
    [Connection] [nvarchar](50) NOT NULL,
    [SchemaName] [nvarchar](50) NOT NULL,
    [TableName] [nvarchar](100) NOT NULL,
    [TargetSchema] [nvarchar](101) NOT NULL,
    CONSTRAINT [PKC_Table] PRIMARY KEY CLUSTERED ([ID] ASC)
        WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
ALTER TABLE [meta].[Tables] ADD CONSTRAINT [DF_Tables_TargetSchema] DEFAULT (N'BimlDemo_DerivedMetadata.Stage') FOR [TargetSchema]
GO
ALTER TABLE [meta].[Tables] WITH CHECK ADD CONSTRAINT [FK_tables_REF_Connections] FOREIGN KEY([Connection]) REFERENCES [meta].[Connections] ([Name]) ON DELETE CASCADE
GO
ALTER TABLE [meta].[Tables] CHECK CONSTRAINT [FK_tables_REF_Connections]
GO
ALTER TABLE [meta].[Tables] WITH CHECK ADD CONSTRAINT [FK_tables_REF_Schemas] FOREIGN KEY([TargetSchema]) REFERENCES [meta].[Schemas] ([ID])
GO
ALTER TABLE [meta].[Tables] CHECK CONSTRAINT [FK_tables_REF_Schemas]
GO

CREATE TABLE [meta].[Columns](
    [ID] AS (([TableID]+'_')+[ColumnName]) PERSISTED NOT NULL,
    [TableID] [nvarchar](202) NOT NULL,
    [ColumnName] [nvarchar](50) NOT NULL,
    CONSTRAINT [PKC_Column] PRIMARY KEY CLUSTERED ([ID] ASC)
        WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
ALTER TABLE [meta].[Columns] WITH CHECK ADD CONSTRAINT [FK_columns_REF_tables] FOREIGN KEY([TableID]) REFERENCES [meta].[Tables] ([ID]) ON DELETE CASCADE
GO
ALTER TABLE [meta].[Columns] CHECK CONSTRAINT [FK_columns_REF_tables]
GO
INSERT INTO [meta].[Connections] ([Name], [ConnectionString], [Type])
VALUES ('AW', 'Data Source=.;Initial Catalog=AdventureWorks2014;Provider=SQLNCLI11;Integrated Security=SSPI;', 'Source')

INSERT INTO [meta].[Connections] ([Name], [ConnectionString], [Type])
VALUES ('Stage', 'Data Source=.;Initial Catalog=BimlDemo_DerivedMetadata;Provider=SQLNCLI11;Integrated Security=SSPI;', 'Target')

INSERT INTO [meta].[Databases] ([Connection], [DB])
VALUES ('Stage', 'BimlDemo_DerivedMetadata')

INSERT INTO [meta].[Schemas] ([DB], [Schema])
VALUES ('BimlDemo_DerivedMetadata', 'stage')

INSERT INTO [meta].[Tables] ([Connection], [SchemaName], [TableName])
VALUES ('AW', 'Person', 'Person')

INSERT INTO [meta].[Tables] ([Connection], [SchemaName], [TableName])
VALUES ('AW', 'Person', 'PersonPhone')

INSERT INTO [meta].[Columns] ([TableID], [ColumnName])
VALUES ('AW_Person_PersonPhone', 'PhoneNumber')