Over the last several years I've quite literally fallen into jobs requiring different technologies and skills... none of which I'd seen in my college courses. But thanks to a few books and the wonderful world of search engines, forums and technical blogs, a fella can set out to learn pretty much anything "technology". My first love is ColdFusion, but I'm pretty much open to learning as many new skills as possible. That's enough for now!
Home page: http://www.cfyves.com
Posts by Yves
This is a little issue that bugged me: instead of going into the Visual Studio project and processing dimensions there, I wanted to be able to multi-select my dimensions and reprocess them.
But multi-select isn't something you can do directly in SQL Server Management Studio's Object Explorer.
So… here's the hidden trick:
- Select the dimension
- Hit F7
- And then you have the "Object Explorer Details" pane open, where you can do a multi-select.
You can also open "Object Explorer Details" from the "View" menu once a dimension is selected.
My scenario was this:
Pulling in several million rows of data from a DB2 database into a SQL Server 2012 Enterprise Analysis Services setup. The data was basically transaction records keyed on several fields, like location, job number, etc. And this data came from a main system that was not developed or maintained by the devs I work with.
But there was an element of the data's structure that wasn't completely rock solid: getting the most recent transaction of its type based on a max date, max time (on the given date) and also a max sequence number.
The problem was that on occasion there was no sequence number, and sometimes no time record either, really making it impossible to accurately get the desired results. This transactional data is used, from what I can tell, as a log compiled by a few different "sub programs" or modules within the main application.
This data was accessible through the legacy system, an ancient green-screen program, which still worked quite well but was no longer being used. However, when viewing the data in this program you could see record-level keys… but these were not accessible as a query field.
These keys were assigned in load order, so with the proper criteria they give the desired results (in my case).
At first we didn't think it was possible to get these record numbers, which would not only fix the current problem but also simplify our update scripts.
After a little research (thanks Google!) I found that it was so very easy.
I found DB2’s RRN() function. All I had to do was pass the table name to the function in the query and the RRN function would return this Relative Record Number.
SELECT RRN(users) AS rID, FirstName, LastName
FROM users
FETCH FIRST 10 ROWS ONLY
Or using an alias on the table name
SELECT RRN(u) AS rID, FirstName, LastName
FROM users AS u
FETCH FIRST 10 ROWS ONLY
This query would return 10 records (the SQL Server TOP 10 equivalent), with rID as the Relative Record Number.
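As a sketch of how this helped with the original problem (the table and column names here are made up for illustration): since the records were loaded in entry order, the highest RRN per key identifies the most recent transaction, even on rows where the time or sequence-number columns are missing.

```sql
-- Hypothetical table "transactions" keyed on Location and JobNumber.
-- The row with the highest RRN per key is the most recently loaded
-- transaction for that key.
SELECT t.Location, t.JobNumber, t.TranDate, RRN(t) AS rID
FROM transactions AS t
WHERE RRN(t) = (
    SELECT MAX(RRN(t2))
    FROM transactions AS t2
    WHERE t2.Location  = t.Location
      AND t2.JobNumber = t.JobNumber
)
```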
Thought I’d add this for future reference.
I was working with a column that was defined as a varchar in a database but was intended to contain integer values. I'm not sure of the original purpose for this, as I'm not the database's author.
The program that is used to enter data also allows the input of non-numeric characters.
So, in my scenario, this could be a problem when joining onto that column in a query.
Using ISNUMERIC() in this way eliminated all non-numeric data for me.
ISNUMERIC(dbColumn + '.0e0') = 1
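As a rough sketch of how this fits into a join (the table and column names are hypothetical): filter the varchar column down to pure integers first, then join on the converted value. Appending '.0e0' makes ISNUMERIC() reject values like '12.3', '$5' or '1e4' that it would otherwise accept.

```sql
-- Hypothetical tables: orders.CustomerRef is the varchar column,
-- customers.CustomerID is the integer key we want to join on.
SELECT o.OrderID, c.CustomerName
FROM (
    -- keep only rows whose CustomerRef is a pure integer
    SELECT OrderID, CustomerRef
    FROM orders
    WHERE ISNUMERIC(CustomerRef + '.0e0') = 1
) AS o
JOIN customers AS c
    ON c.CustomerID = CAST(o.CustomerRef AS int)
```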
So, you use the internet. On your PC/Laptop? Maybe on your phone or tablet?
Think about the things you do on these devices… Twitter? Facebook?
You may be a Netflix subscriber… Maybe you do research… heck, maybe you do business online.
Well, the wonderful thing about the internet is you can find stuff… you can find information, you can find entertainment…. it’s always been wide open for the most part.
This "Net Neutrality" ruling (the fight over net neutrality has been raging for quite a while now) could change this.
The internet is made up of several ISPs (Internet Service Providers)… like Verizon, Bell, etc. Basically, part of what this is about is giving the ISPs clout: the ability to dictate what goes through their networks, and maybe charge you a little more for more access. Think of dividing up the internet like a pie… maybe you'll have to pay more to access a slice that they don't offer (anymore).
Maybe they will limit your usage on services like Netflix or other streaming services. Here’s an article on Netflix’s recent reaction to the ruling.
Obviously, if you’re an online business, this may limit your potential clientèle and also limit your earning power without paying more. (If paying more is the answer)
So even for huge online businesses like Netflix, Google and many others, this goes directly against their business model. It also flies in the face of why the internet has been so valuable and has grown by leaps and bounds.
Imagine an ISP that could, and would, limit usage of Netflix and then promote its own version of the service. Essentially, if you ever had a business idea built on the "open internet" model, it could be halted and dead in the water.
This could, and likely would, stifle innovation, since "access" would be in the control of businesses with an interest in what you can and can't access.
Also, imagine an ISP filtering out sites as it wishes. Bad reviews of its service? Well, maybe you can't find them on its network.
Seriously, the internet's value is in its openness.
Greetings. It's been so long since I've even touched this blog that I write this first line with a touch of shame.
Ok… I’m over it now. 😉
I’ve been working heavily with SQL Server 2008 and with DB2 for the last couple of years. I really love my job.
Lately we've been working on a data warehouse project which is structured for reporting. This dropped a new concept in my lap, one that at first I didn't find too hard to comprehend (well, honestly, I still don't fully), but it threw a wrench into the way I've been accustomed to working.
– Data denormalization –
Writing SQL code for many years, for me, has been about creating relational databases where data is divided up and joined together in queries. Denormalization brings forth the concept of duplicating data for the purpose of having it readily available. (Fast)
So the thought process has been different: writing SQL jobs that will suck in data in the format we require to build our warehouse (cube-ready). What this method really seems to be is a space-versus-time trade-off. You have much more data, duplicated… but the result is that you can query it very fast.
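As a minimal sketch of the idea (all table and column names here are invented): instead of joining the normalized tables at query time, the SQL job flattens them once into a wide, cube-ready table.

```sql
-- Hypothetical names throughout. Flatten the normalized tables once,
-- so the cube reads one wide table instead of joining at query time:
-- more disk space, much faster queries.
SELECT s.SaleID,
       s.SaleDate,
       s.Quantity,
       p.ProductName,      -- repeated on every sale row...
       p.Category,         -- ...instead of joined when queried
       l.LocationName
INTO dbo.FactSales         -- the cube-ready, denormalized table
FROM sales AS s
JOIN products  AS p ON p.ProductID  = s.ProductID
JOIN locations AS l ON l.LocationID = s.LocationID
```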
The cube my co-worker created was really really fast. Slicing and dicing the data at the blink of an eye.
Data is only as good as its usability.
I’ve been using Reporting Services a lot. As with many things I learn on a “needs” basis.
Something I've had to do quite a bit of is run counts on the data returned by datasets. One thing that stumped me early on was how to run a count against data that was returned based on dates.
It was actually nice and simple:
=SUM(IIF(Fields!DateField.Value = Today(),1,0))
In this example what we're basically doing is checking whether the date is equal to "today"; if it is, it's given a value of 1, if not, 0. And of course, it's summed up. So this would give us the SUM() of dates equal to today.
A variation of this conditional counting (and something I’ve had to do) is to run a comparison like this against a date and a “non date” value.
So in this example our Fields!DateField.Value is actually stored as an integer. So we need to convert the integer to a date to have it count properly.
=SUM(IIF(new DateTime(Fields!WOODDT.Value \ 10000, (Fields!WOODDT.Value \ 100) Mod 100, Fields!WOODDT.Value Mod 100) = Today(),1,0))
So in the above example, we are creating a datetime type with our DateTime() function and then comparing it to the Today() value. If the newly created date value (which is again based off of the data returned from the dataset) is equal to today’s date it is given the value of 1, if not then 0. And this is summed up.
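For reference, the same conditional count could also be pushed down into the dataset query itself. Here's a T-SQL sketch (the table name is hypothetical; WOODDT is again the integer yyyymmdd column). Style 112 tells CONVERT that the string is in yyyymmdd form.

```sql
-- Count the rows whose integer yyyymmdd date equals today's date.
SELECT SUM(CASE
             WHEN CONVERT(date, CAST(WOODDT AS char(8)), 112)
                  = CAST(GETDATE() AS date)
             THEN 1 ELSE 0
           END) AS TodayCount
FROM dbo.WorkOrders   -- hypothetical table name
```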
I used to have an iPhone 3G and had downgraded to a cheap pay-as-you-go phone for some time. I just wanted a simpler, cheaper phone for texting and calling.
Last summer I decided to get a new smartphone… so I ended up with an HTC Incredible S.
I haven’t completely loaded the phone with apps. I do have certain apps that I use on a regular basis… like the DropBox app, a couple of sports news apps… Hootsuite and Flickr.
One nice little app I found is SMS Backup+.
Basically, you can sync this app to back up your text messages to your Gmail account. It also creates a label to put this mail in to keep it organised.
This is great if you want to archive messages and also keep your phone a bit cleaner.
I was digging into trying to find custom URLs for Google Plus today and didn’t see any options to do this. I remember back when custom URLs came out in Facebook.
It's certainly much easier and nicer to link to your social media site with a custom, easy-to-read URL than some long string of numbers and/or letters that you'll never ever ever be able to remember.
So… after a quick search I found a little service that seemed to cater to exactly this (Google+ custom URLs). I decided against just using another link-shortening site and used this instead.
So if this is something that interests you, just go over to gplus.to, enter your Google+ ID (the site shows a screen shot of what this means) and the desired nickname/url string… and voila…. if no one’s already chosen it of course!
I’ve used it, and you can find me over here:
As so many people do, I have a Google AdSense account, and recently I made some changes, adding some YouTube monetization to it.
While looking in the AdSense account, which I do on occasion, I noticed a new line item that I just didn't recognize: "Hosted AdSense for Content".
Turns out that this line is actually from the YouTube video views. I just found it a little strange that it didn't indicate YouTube or something similar as the source… but then again, maybe this title is (or will be) used for other types of digital media in the future… thus "Hosted AdSense for Content".