Like many folk out there, I run my DB using a case-insensitive collation (CI_AI), which is perfect for almost every need.
Except for one I just had where I needed to do an exact string comparison – case, accents – the lot.
Since Sql Server doesn’t allow you to specify the collation on a parameter, I needed something that would do the same for me. I played around for a while with a table variable, setting the collation inside that, but it was way too complex and messy for such a simple need.
Then I came up with this – and it works wonderfully. Simple, straightforward and can be used in procedures, functions, pretty much anywhere.
If Object_Id('Lib.ExactCompare') Is Null
    Exec sp_ExecuteSql N'Create Function Lib.ExactCompare() Returns Bit As Begin Return Null; End;'
Go
/**************************************************************************
Object:  ExactCompare
Schema:  Lib
Created: 2/6/2014
Author:  Rachel Ambler
Use:     Performs an exact comparison of two strings (therefore making it
         both case and accent sensitive)
Notes
===========================================================================
Change History
Date        Initials    Details
===========================================================================
2/6/2014    RA          Created
***************************************************************************/
Alter Function Lib.ExactCompare
(
     @Parameter1 NVarChar(MAX)
    ,@Parameter2 NVarChar(MAX)
)
Returns Bit
As
Begin
    Declare @Ret Bit = 0;

    If Cast(@Parameter1 As VarBinary(MAX)) = Cast(@Parameter2 As VarBinary(MAX))
        Set @Ret = 1;

    Return @Ret;
End
Go
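Calling it is as simple as you’d hope. A few hypothetical comparisons (the literals here are my own examples, not from any real table):

```sql
-- On a case/accent-insensitive database, '=' would call all of these equal.
-- ExactCompare only returns 1 for a byte-for-byte match.
Select Lib.ExactCompare(N'Hello', N'hello') As CaseDiffers    -- 0
      ,Lib.ExactCompare(N'café',  N'cafe')  As AccentDiffers  -- 0
      ,Lib.ExactCompare(N'Hello', N'Hello') As ExactMatch;    -- 1
```

One caveat worth knowing: because the comparison is on the binary image, trailing spaces and differing Unicode normalization forms also count as differences.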
Keywords: Case Matching, procedure, Function, Sql Server
Learnt a lot recently. Learnt that writing a package builder for SSIS is a right royal PITA.
I’d hate for other people to go through what I’ve been through, and the only other API I’ve seen available wasn’t the easiest thing to follow.
So I created this: SS-eye-S. Why the eye? Well, why not? Easy on the eye perhaps? Eye of Sauron…?
So, license picked, host chosen and, et voilà, version A01 (ugly but workable) is now available on CodePlex.
This has been nagging me for quite some time now. I think about it often and it bothers me that I never see this in action. Science Fiction writers just love to employ this as a means to move a story along improbable routes and fascinating incursions but I’m just not buying it.
What am I talking about? Time travel.
And the way I see it, it ain’t that easy, regardless of how we eventually manage to move in the fourth dimension. Everything I’ve read or watched, from the classics by such luminaries as Ray Bradbury, to the crazy but fun nonsense that was TimeCop (there, that aged me), they all, to a person, miss out one fundamental aspect of Time Travel. Something that will make realistic time travel an extremely dangerous prospect, if not an impossible one.
Now, I’m not going to get into the physics of how to move through time here – that’s not my beef. I’m sure that one day we’ll do it; I’m just not really sure how useful it would be to the common person.
Let’s be honest and admit it: we’re already time travellers. We’re always moving forward in time at a relative speed. No matter what we do, we’re always travelling forwards through time and, unless you’re dealing with all the problems that come with either black holes or travelling close to light speed, it’s always the same for you, me and that ugly green slime ball with a ferocious temper and dreams of Universal Domination on Tau Ceti III (which, at a shade less than 12 ly away, is practically a close friend to us all).
Trouble is that, no matter what else we’re doing, we’re always travelling in the other dimensions as well. Not relatively but we are travelling. We ALWAYS travel in time and space.
For our first example, let’s imagine we want to travel 10 seconds into the future. All the SF you read about would have us winking out of existence and winking back, 10 seconds later.
Trouble is, this lump of rock we call Earth isn’t stationary – it’s rotating (and completes one rotation roughly every 24 hours).
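A quick back-of-envelope sketch makes the point (round figures, worst case at the equator):

```python
# How far does a point on the equator travel in 10 seconds,
# purely from the Earth's rotation?
EQUATORIAL_CIRCUMFERENCE_M = 40_075_000  # metres, approximate
SIDEREAL_DAY_S = 86_164                  # seconds for one full rotation

surface_speed_mps = EQUATORIAL_CIRCUMFERENCE_M / SIDEREAL_DAY_S  # ~465 m/s
displacement_m = surface_speed_mps * 10                          # ~4.65 km

print(f"{surface_speed_mps:.0f} m/s, {displacement_m / 1000:.1f} km in 10 s")
```

Several kilometres of sideways displacement in ten seconds – more than enough to put you inside a hill.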
The graphic above shows the effect of 10 seconds of movement. As the planet rotates the fixed point in the 4th valley (from left) essentially moves to the first hill.
So, what does this mean for our erstwhile mad scientist time traveller? It’s only 10 seconds, how bad could it be?
Yeah – he’s dead. He travelled in time ONLY and, as a result he reappeared in the middle of the hill.
OK, the argument might go, why not do this in the air?
Well, perhaps so, 10 seconds in the air might not be so bad. Unless you happen to materialize into a passing eagle.
And it’s not just flying things one has to worry about. The Earth does more than rotate; it’s also in orbit around the Sun. It travels from one side of the Sun to the other every six months.
Imagine what it would be like. You travel 6 months forward in June and reappear in empty space, because Earth is now another 180 million miles away in the other part of the solar system. Total and abject bummer – talk about sucking vacuum!
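The same sort of back-of-envelope arithmetic backs up that figure (treating the orbit as a circle):

```python
# Six months from now, Earth is on the far side of its orbit,
# roughly one orbital diameter away from where you left it.
AU_MILES = 92_955_807            # mean Earth-Sun distance in miles

separation_miles = 2 * AU_MILES  # ~186 million miles

print(f"~{separation_miles / 1_000_000:.0f} million miles from where you left")
```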
Trouble is that space isn’t empty. Whilst you might be fine, you might also reappear in the middle of a passing asteroid – and you thought materializing inside a passing eagle was bad!
But wait! It gets even worse!
Let’s pretend we can map every object in the solar system – every last rock. Apart from the ridiculous amount of storage and processing it would take to map all this (but hey, we’ve cracked time travel, so how hard can that be), there’s one more fly in the galactic ointment.
Ever since the big bang, the universe has been expanding. We don’t realize this because the distance we travel is relative to a fixed point and the amount of time we’re talking about makes our insignificant lifetime almost irrelevant to the growth.
But grow it does (granted there are a few theories that we’re either contracting – at which point the logic still stands, or that we’re not moving at all – but the last theory is a pretty wackadoodle concept that’s yet to gain traction).
There’s our little neck of the woods, the Milky Way, travelling in three dimensions simultaneously. So, if we were to travel in time, not only do we have to cater for planetary rotation, the orbit and everything else in our solar system, but now we really need to map out every last single object in the universe and its trajectory in an ever-expanding universe, just so we can reappear in a safe place.
The space in-between the galaxies for example. There’s not much there, why not take our chances with that?
Space isn’t that empty, even when it is. It can vary between 10⁴ particles per m³ and 10¹⁰ particles per m³. Pretty busy, but potentially survivable? Let’s not worry about the atomic structure of deep space and all it still contains at this point, and assume we can move past that (what’s a hydrogen atom between friends?).
If we thought that materializing inside an asteroid, mountain, or giant eagle is bad, then remember this: Even a 0.1 μm particle lodged inside your brain can be the difference between life and death. Put that inside an electronic circuit and you could kill the entire system!
OK, let’s just pretend we can skip past all that and somehow we can dislodge all these things. Congratulations, we’ve now travelled 100 years in the future in deep interstellar space.
Question – how did you get there and how long is it going to take to get back ‘home’?
So the last time I invited Linux Mint over to my house for a little get-together it all ended rather badly (http://random-thunks.com/2010/06/03/my-brief-sojourn-with-linux-mint-is-over/). At the time we weren’t destined to be friends. I found the GUI too raw for my liking and eventually found myself wandering onto Ubuntu.
Well, yeah, that didn’t last either. It did – right until Unity came around and gatecrashed. From that point forwards we were destined for divorce.
So, I ended up putting Windows 7 on the trusty ASUS netbook and it ran reasonably fast. Right up until I had to load up on the obligatory antivirus, firewall and other protection mechanisms any Windows box needs.
It was useable, and it was finally on my home domain, so that at least made it easier to manage. But then I had to constantly keep it running so the AV and Windows Updates could keep themselves current. When I didn’t, I’d find myself spending the next three hours patching, having missed out on three months’ worth of updates.
Then two things came across my radar that piqued my interest again: Linux Mint 15 XFCE and Centrify.
XFCE is a GUI designed for low-powered PCs. My lil’ ASUS 1005 certainly fitted into that category: 2GB RAM and a slow and meager (only 160GB) hard drive. But I still loved the lil’ fellow and I really wanted to give him a new lease of life. XFCE seemed to fit the bill nicely.
But there was still the little matter of utilizing my AD server to ensure I had a centralized method of authentication. The previous solution I’d tried had been a dismal failure and I ended up blowing it away and reinstalling Linux when it messed up so badly that I couldn’t even connect to the local root account. I’d rather hoped things had moved on from the dark days and I was right.
Centrify comes in multiple flavors – including my favorite flavor: Free. Now, granted, the free version doesn’t allow you to utilize all aspects of AD (group policy and the like) but that’s fine. I employ very few policies in my Domain at home and the only one I really care about is the password policy (which pisses the wife off no end!). Centrify had two boxes to tick and it ticked both: it’s free, and I can use it to authenticate against my AD servers.
So, now I’m back with Mint and pretty happy with it. Sure, there are still some aspects that confuzzle the heebies out of me, and the multi-monitor side of Linux still seems to be several years behind that of Windows (I can’t speak for Macs, so I’ve no idea how good they are at handling multi-monitor, but I’d imagine they’re much better at it than Linux!). That said, my ASUS only has a VGA analog output, which tears out my eyeballs when I drop it onto my 25” (also an ASUS) monitor, so I only use it very sparingly.
The one thing that I did spend some time on, though, was the MDM login selector that pops up after installing Centrify, and this is where a lot of searching around finally yielded the answer I’d been looking for.
First off, the selector populated itself with all the users in the domain. If one has the full Centrify package then this can be added as a group policy. I’m cheap though so I had to live without. I did however find all I had to do was edit the centrifydc.conf file, hunt down the pam.allow.users section and enter the login id’s (not the account names) of the users I wanted.
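For reference, the edit in question is a one-liner. The account names below are made up; substitute your own AD login IDs:

```
# /etc/centrifydc/centrifydc.conf
# Restrict logins (and hence the MDM face browser) to these AD accounts
pam.allow.users: rachel, spouse
```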
Next I wanted pictures. Avatars. Something other than the boring faceless faces the face browser displayed. This was significantly harder to sort out, and it appears there’s a long-standing bug that’s not helping the matter.
First off, .face. Each avatar needs to be in a .face file in the users home folder. Apparently not just any sort of image either, but a correctly sized one. Fortunately there’s an app that does that resizing for you: mdmphotosetup.
Run that lil’ baby and click on the existing (non)avatar to hunt down a new one. When you save it’ll resize the image as a .face file. That’s half the issue resolved. Now we’ve the bug to get around.
The issue at heart is that the face browser doesn’t actually have the right permissions to get at the .face files, because the folder permissions prevent it from doing so. I daresay one could fiddle around with the ACLs on the files to solve this, but at this juncture I was willing to take the second option, which was to make each user’s home directory readable by anyone (admittedly this compromises security a tish, but for my home purposes it wasn’t a problem). One quick chmod 755 of the folder and another chmod 644 of the .face file later, and I was finally in business.
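Concretely, the two permission changes amount to this (the username here is just a stand-in for whichever account owns the avatar):

```shell
# Make the home directory traversable and readable by everyone,
# and the avatar file itself world-readable.
chmod 755 /home/rachel
chmod 644 /home/rachel/.face
```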
WINE was a given. The wife loves her some MS Paint and that needed to be there. WINE provided very nicely, as it did for Evernote and My Life Organized. Have yet to try MovieCollector under WINE but that’s next on my list.
Native apps included the latest Linux preview of Spotify (need my tunes!), FileZilla, TrueCrypt and Google Chrome.
I went a bit mad looking for the best File Manager but came to the conclusion that Dolphin served all my needs – including the ability to set up shortcuts to explore the various shares I had on Windows.
LibreOffice fulfilled the main office duties I had and I did also grab a copy of Scribus for the desktop publishing requirements (true, it ain’t no Publisher, but it’s good enough for me).
CrashPlan went on like a dream (as it always does) and I rounded it off with a side order of DropBox, onto which I sync all the pictures from my phone.
Life, by and large, is good once more.
In the case of my current project, Oracle is the host for most of the data I need to process. Two Oracle instances in fact.
The project in question loads up a good chunk of the Oracle databases each night and drops the data onto Sql Server where it is conjoined, consolidated and combined into something that brings the sources together in a cohesive whole (today’s blog post brought to you by the letter C and the number 2).
To achieve this we use SSIS to bring the data across, and that’s where Devart’s dbForge utility comes into play.
Now, I could use Toad – I could also shoot myself in the head. God bless Toad, but it’s not exactly user friendly (or if it is, it’s très picky as to whom it classes as friends). Especially coming from a Sql Server background.
Yes, it does an awful lot, but for my purposes I don’t need an awful lot – I need the right amount.
My needs were simple. The tool must tick all of the following boxes:
- Does it have a simple UI that’s akin to Sql Server Management Studio?
- Is it cheap?
- Does it allow me to store connections?
- Will it allow me to limit my view to just my objects or show me the entire database at the touch of a button?
- Will it create SELECT scripts for me that will allow me to paste into SSIS to fetch the data?
- Does it give me a simple to use grid showing me the specifics of each table, ala Sql Server Management Studio?
- Can I search for columns easily, in a manner that allows me to explore the DB when looking for ‘the odd stuff’ (and we have plenty of that in this project)?
Toad succeeds on all but two points – it’s neither cheap nor is it simple. Far too many times with Toad I find myself trying to work out which button does what and which editor is for what connection and… Arrgggghhh!
Sorry – but this project is tough and I don’t need an annoying (and very crash-prone) tool to get in my way. No, I needed something else.
Then I came across Devart and their dbForge Studio which, thankfully, had a 30-day trial period. Once downloaded (a quick and painless affair) I fired it up and immediately felt at home. Devart had obviously gone down the correct road and developed a tool that can be used easily first. It even has dockable windows – something I use heavily with both Management Studio and Visual Studio.
Instantly recognizable – here’s a GUI to control a database – somewhere us Sql Server freaks will feel right at home.
To the left we have the stored connections. Expand each one and you get the familiar breakdown of all the objects in the instance.
Initially the objects are limited to just the ones you own but the rest of the DB is but a click of a button away.
Being fully dockable one can always move the explorer out of the way until needed.
So, already, point one had been fulfilled. Point two is an easy one: $299 gets you the whole enchilada. And when one deals with Oracle, $299 is cheap!
So, what about everything else? Connections for example, can you store them easily?
Well, yes. Rather easily – once one gets around Oracle’s tnsnames.ora file. I’m no Oracle expert, I know, and so I’ll admit that those things had me stumped for a while (trying to find the right client to install first was a major exercise in brain scratching), but eventually I was able to iron those files out. None of this is, of course, Devart’s fault. Nope, the blame there lies with Larry’s boys and girls!
So, having got a properly formatted tnsnames.ora file, the dialog to set up the connection was simple:
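For anyone else wrestling with the same file, a minimal tnsnames.ora entry looks something like this (the alias, host and service name are all invented here):

```
ORADB1 =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = oraserver.example.com)(PORT = 1521))
    (CONNECT_DATA =
      (SERVER = DEDICATED)
      (SERVICE_NAME = oradb1.example.com)
    )
  )
```

Once the alias resolves, the alias name is all the connection dialog needs.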
As one would hope with an Oracle tool, Devart supports multiple Oracle homes and lets you choose the method of connecting to the server (e.g. as a normal user, SYSDBA or SYSOPER).
Once configured, the data was stored for future use of the tool. Point three, checked.
Scripting selects was incredibly important, given that many of the tables and views I’m importing have hundreds of columns, and was thankfully as blissfully simple as pointing to the table or view in question and asking dbForge to generate the Select script. It’s the little things like this that make life so much easier…
As for looking for objects, I do so like the option on dbForge to do this.
From here one can select where to search (a single Schema, all schemas or just selected schemas) and what to bring back. Anything it does find is clickable to take you to the object in question for more detail. This again, for me, was a lifesaver.
So, with all the checkmarks ticked, I went on to discover what else this program would do for me. To my great delight I found that first off, it did handle Version Control and, most especially for me, TFS! This was a godsend since I do have a few objects I had to create in Oracle that are part of my project but that, up until now, had no easy way to keep track of in TFS.
Something totally unexpected but very much handy is the Master Detail browser. As it sounds this enables you to use the GUI to choose which table you want as a master and what data you wish displayed and what you wish to show as a Detail table. Much of the work I do is keeping the users happy and this often means digging deep into Oracle to show them that the data they’re looking at on Sql Server is indeed the same data as on Oracle. This tool is a terrific way of doing that since it presents the information in a nice easy to see format: Top pane Master, Bottom Pane Detail. Even OUR users get that!
Every decent tool should have Snippets and here dbForge doesn’t disappoint. The snippets manager is just as you’d imagine and allows you to create Snippets with fillable parameters:
Using said snippet – even easier. dbForge supports Intellisense and that includes snippets. Start typing and the Intellisense suggestion box will pop up with options, including the newly created snippet if so desired:
Selecting the snippet populates the editor with the snippet in question and positions the caret exactly where it should be!
There is no doubt in my mind that Devart approached this product in the right way. God love Quest and their Spotlight tool for Sql Server, but I’m sorry, Toad just didn’t cut it for me – and I tried, for months and months! Even after nearly a year I’m STILL lost on how to do some things in Toad and which of the myriad of buttons I should (or shouldn’t) press. Yeah, Toad has an awful lot in there, but then, that may be its biggest issue. It’s just too gosh-darned complicated.
On the other hand, dbForge was right there for me from the get go. Within just a few minutes of installing it I was being productive (and significantly faster than I was with Toad).
All of a sudden, doing work on Oracle no longer seems a scary proposition. Oracle has suddenly turned from being a distant relative to a close friend. Never thought I’d say that.
Doing something different here; I have a lot of products I use on a regular basis that have become somewhat indispensable for my daily life (work or otherwise) so I’ve decided to talk about some of these in a new series of posts I’m calling Product Spotlight.
First up is TeamCompanion for TFS and MS Outlook 2010-2013.
Disclaimer first: if you don’t use TFS or Outlook, this product is useless to you. If you do use them and have numerous situations where emails and Outlook tasks need to be moved into TFS as work items, bugs etc., then this IS the tool for you.
Its raison d’être is simple – to be a bridge between Outlook and TFS – and it performs this task admirably. I have multiple emails coming in daily that either need to become a work item or be attached to one. TeamCompanion makes this job very simple. With a simple click on the ribbon bar I can create a new TFS work item with the email or task either embedded or attached. If an email arrives that’s updating requirements on an existing work item then that’s easy too – just search for the item in question either directly or using a TFS-based search and it’ll attach the message directly.
In addition, TeamCompanion is great for pulling status reports out. Got a TFS query that would look great as a status update email? No problem, you can simply select your query, even choose which items to include from that query and drop it directly in an email. One minute and you’re done. That sound like too much work? Heck, configure a scheduled query and let TeamCompanion automatically send out emails at the end of the day (or, however often you choose).
TeamCompanion supports a pretty large range of customization – so you can add defaults to created work items based on either hard-coded or meta-data from the Outlook item.
As you’d expect from any decent TFS client, TeamCompanion is a fully-fledged tool. You can perform various sorts of project administration with it – update Areas, Iterations and more.
There are configuration options up the wazoo lurking behind the scenes here. The plethora of options include how to parse incoming emails (looking for TFS artifacts), what to populate when creating work items, reporting and more. So, if you’ve got a customized Work Item then TeamCompanion should be able to handle those extra columns with aplomb.
If I had one complaint, it would be the horribly bland VS2012 monochromatic ribbon bar. It’s bad enough on Visual Studio, but in Office it looks worse. However that’s my only real issue.
Price: $139.00 per user (multi-user discounts available), upgrades start at $69. I have to admit that their upgrade policy up to now has been terribly generous – version 5 is the first upgrade I’ve had to pay for in many years, so I can’t really complain!
So, in order to test this whole 1K string limit in Tableau, I started off by creating a table called BigStrings with one row inside of it:
 Id              Int            Not Null
,OriginalString  NVarChar(MAX)  Null
,Clob            NText          Null
,[C1-4000]       VarChar(8000)  Null
,[C4001-8000]    VarChar(8000)  Null
,[C8001-12000]   VarChar(8000)  Null
This table has 4,000 X’s plus 4,000 Y’s and 4,000 Z’s inside the OriginalString column. This column was replicated over to the NText based Clob column. Id was set to 1.
Insert Into Test.BigStrings
(
     Id
    ,OriginalString
    ,Clob
)
Values
(
     1
    ,Replicate(Convert(NVarChar(MAX), 'X'), 4000)
     + Replicate(Convert(NVarChar(MAX), 'Y'), 4000)
     + Replicate(Convert(NVarChar(MAX), 'Z'), 4000)
    ,''
);
Then I placed the values ‘ONE’, ‘TWO’ and ‘THREE’ at character positions 2,000, 6,000 and 10,000 respectively.
Update Test.BigStrings Set OriginalString.Write(N' ONE ', 2000, 5) Where Id = 1;
Update Test.BigStrings Set OriginalString.Write(N' TWO ', 6000, 5) Where Id = 1;
Update Test.BigStrings Set OriginalString.Write(N' THREE ', 10000, 7) Where Id = 1;
Finally I placed the first 4,000 characters in the [C1-4000] column, the next 4,000 in [C4001-8000] and the final 4,000 in [C8001-12000].
Update Test.BigStrings
Set  [C1-4000]     = SubString(OriginalString, 1, 4000)
    ,[C4001-8000]  = SubString(OriginalString, 4001, 4000)
    ,[C8001-12000] = SubString(OriginalString, 8001, 4000)
Where Id = 1;
I tested the table first by running some simple queries. First, confirmation that the values were in the table:
Select   Substring(OriginalString, 1995, 20) As One
        ,Substring(OriginalString, 5995, 20) As Two
        ,Substring(OriginalString, 9995, 20) As Three
From Test.BigStrings;
|XXXXXX ONE XXXXXXXXX||YYYYYY TWO YYYYYYYYY||ZZZZZZ THREE ZZZZZZZ|
Next, searching for some data:

Select Id From Test.BigStrings Where OriginalString Like '%TWO%';
Select Id From Test.BigStrings Where [C4001-8000] Like '%TWO%';
Select Id From Test.BigStrings Where [C8001-12000] Like '%TWO%';

(1 row(s) affected)
(1 row(s) affected)
(0 row(s) affected)
I then created a Data Source in Tableau and pulled the table in.
Due to OriginalString being defined as NVarChar(MAX), Tableau was unable to locate ‘ONE’, ‘TWO’ or ‘THREE’, since it capped the string at 1,024 characters and ‘ONE’ doesn’t start until position 2,000. Tableau was able to locate ‘ONE’ in [C1-4000], ‘TWO’ in [C4001-8000] and ‘THREE’ in [C8001-12000]. It was not able to find anything in the Clob column because, again, Tableau caps anything longer than 8,000 (VarChar) or 4,000 (NVarChar) characters to 1,024 characters.
Next I created a calculated column in Tableau which concatenated the [C1-4000], [C4001-8000] and [C8001-12000] columns together and tried searching in there. ‘ONE’ and ‘TWO’ were both located, but not ‘THREE’. This is seemingly because Tableau is Casting the calculation based on the length of the first column (8,000 characters). I confirmed this by changing the length of the three columns to 4,000 characters and at that point Tableau only found ‘ONE’ (since ‘TWO’ is at position 6,000).
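For the record, the calculated field itself was nothing fancier than a straight concatenation of the three chunk columns (field names as defined in the table above):

```
// Tableau calculated field: stitch the three chunks back together
[C1-4000] + [C4001-8000] + [C8001-12000]
```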
Finally I created a calculated column that returned the total length of the last three columns and with the data types set to be a length of 4,000 each, it reported 4,000, at 8,000 each it reported 8,000.
So, at this juncture, even combining columns in a Tableau report would not gain us anything over and above what I already have, since I’d converted the columns from NVarChar(MAX) to VarChar(8000).
OK, so what’s the workaround?
The devil’s in the details, and here, the details are the driver. As instructed by Tableau, we were using the latest and greatest OLEDB driver to connect to Sql Server. OK, I know that MS just pulled the rug from under everyone by announcing they were deprecating OLEDB in favor of ODBC (read: “We lost the standard battle”) but unfortunately, OLEDB is still faster than ODBC. So, until we’ve a bloody good reason to change, our plan was to stay on OLEDB for a while.
Enter “Bloody Good Reason” from Stage Left.
Just for S’s and G’s, I wondered what would happen if I used the ODBC Driver as a Tableau data source, as opposed to the regular Sql Server one. So I selected ODBC and then the Native Client.
And there it was, the answer. With ODBC my NVarChar(MAX) columns came through in all their uncapped glory. No 1K artificial cap, no 8,000 VarChar cap, nope, I had 12,000 characters of searchable goodness.
So, conclusion? Use the ODBC driver if you can until there’s a fix.
Me? I blew away a perfectly excellent Holiday party because of this bug, and I wasn’t best pleased. Still, an answer is an answer and at this juncture, I’ll take it.
Create Table Test.BigStrings
(
     Id              Int            Not Null
    ,OriginalString  NVarChar(MAX)  Null
    ,Clob            NText          Null
    ,[C1-4000]       VarChar(8000)  Null
    ,[C4001-8000]    VarChar(8000)  Null
    ,[C8001-12000]   VarChar(8000)  Null
);
Go

Insert Into Test.BigStrings
(
     Id
    ,OriginalString
    ,Clob
)
Values
(
     1
    ,Replicate(Convert(NVarChar(MAX), 'X'), 4000)
     + Replicate(Convert(NVarChar(MAX), 'Y'), 4000)
     + Replicate(Convert(NVarChar(MAX), 'Z'), 4000)
    ,''
);

Select Len(OriginalString) From Test.BigStrings;

Update Test.BigStrings Set OriginalString.Write(N' ONE ', 2000, 5) Where Id = 1;
Update Test.BigStrings Set OriginalString.Write(N' TWO ', 6000, 5) Where Id = 1;
Update Test.BigStrings Set OriginalString.Write(N' THREE ', 10000, 7) Where Id = 1;

Declare @TextPtr Binary(16)
       ,@OriginalString NVarChar(MAX);

Select @TextPtr = TextPtr(B.Clob), @OriginalString = OriginalString
From Test.BigStrings As B;

UpdateText Test.BigStrings.Clob @TextPtr 0 0 @OriginalString;

Update Test.BigStrings
Set  [C1-4000]     = SubString(OriginalString, 1, 4000)
    ,[C4001-8000]  = SubString(OriginalString, 4001, 4000)
    ,[C8001-12000] = SubString(OriginalString, 8001, 4000)
Where Id = 1;

Select   Substring(OriginalString, 1995, 20) As One
        ,Substring(OriginalString, 5995, 20) As Two
        ,Substring(OriginalString, 9995, 20) As Three
From Test.BigStrings;

Select Id From Test.BigStrings Where OriginalString Like '%TWO%';
Select Id From Test.BigStrings Where [C4001-8000] Like '%TWO%';
Select Id From Test.BigStrings Where [C8001-12000] Like '%TWO%';

--Drop Table Test.BigStrings;