Tuesday, March 26, 2013

Note Taking With OneNote - On SkyDrive

In case you didn't know, I'm a huge fan of SkyDrive, a Microsoft-based product that lets you store and share documents in the cloud, giving you access to your files no matter where you are.  As such, I have been deemed a member of the SkyDrive Insiders program.  Should you be interested in becoming an insider, you can learn more about it here: http://sdrv.ms/V0buD7
An opportunity presented itself to me last week that I am rather excited about.  Ashworth College offers a distance education course called "Introduction to Programming Using Visual C#", and I was asked to upgrade their course content to reflect the latest version of the language\IDE, which I am more than happy to do.  This also gave me yet another reason to use SkyDrive, which I wanted to showcase for you today.

Microsoft OneNote is one of my favorite applications; I use it on a daily basis.  Applications like Evernote get a lot of the press, but OneNote is just as robust.  If you've never used OneNote, imagine a Microsoft Word document with free-form text, so you can jot down thoughts, take meeting notes, paste in important URLs, or copy in images\screenshots, all without worrying about page breaks and such.  You can then organize your notes into separate sections, pages and sub-pages, you can have multiple users all edit the same OneNote file, and there is a handy built-in search feature that searches all text AND all image text as well (meaning if you screenshot a web page, that text will be "searchable" later on in your OneNote file).  At work, OneNote is handy for writing down notes and thoughts between meetings.  At home, we use it to update our weekly grocery list or even plan vacations.

SkyDrive has some built-in web applications such as Word, Excel and PowerPoint (yes, you can create Word, Excel and PowerPoint documents on the Web.. as in through your web browser.. as in from any PC you have connected to the Internet.. very cool stuff).  OneNote is also included in SkyDrive, available for free, and also web based... so all of the cool features I mentioned above are available, on the web, for free.  (No strings attached.. which is why I keep using the term "cool" in this post!)

So back to my original story: I now find myself working on this project for Ashworth.  Using SkyDrive, I was able to create a new OneNote document and begin editing and taking notes on the changes I had to make to the course.  When I take a lunch break at work, or decide to stay at the office at the end of the day, I don't have to worry about where my note files are; they're all in the cloud and available to me when needed.  I've been able to make my notes and remain productive at home, at work, or when I'm traveling and only have my iPad handy...  Yet another awesome feature of SkyDrive.

For more help using OneNote in SkyDrive, the Microsoft Office team has a great post already assembled, so go check it out.. now! 

Thursday, March 21, 2013

Sheet of Integrity - 2013

I caved.  I watched no more than 10 minutes total of college basketball this season, yet I had to complete a bracket.. how can you not?  Of course, looking at my picks, I think they can't be more perfect than they are at this moment.  However, 5 minutes into the Oregon\Ok. St. game, I'll be sick to my stomach for not listening to every prognosticator who said you would be a fool not to think the Ducks are ranked way too low and are primed for a solid tourney run.  Oh well.. it's always fun to complete a bracket, if only for bragging rights :)  I followed my usual strategy (which never works for me, so why do I always do the same thing?) of going to a sports website - this year it was si.com - viewing the top 5 "expert brackets", and choosing the majority pick.. with the exception of a few picks.  Most notable are Davidson in a first round upset (oh my goodness, did I really pick them to make it to the Sweet 16?), Michigan making it to the Elite 8, and Wisconsin over Gonzaga.  My reasoning.. I have none.. remember how I said I haven't watched much of any college BB?

What's with the blog title "Breakin References, Breakin Hearts"?

I remember being a kid and listening to my aunt tell several family members how much she enjoyed the movie "The Silence of the Lambs".  I was pretty young at the time (10 or so), so I wasn't very aware of who Anthony Hopkins was, let alone why adults would capture overweight individuals, hold them captive in a pit threatening to spray them with hoses, or why someone would write a movie about it.  Nevertheless, as a child I asked my aunt why the movie was called what it was called.  I don't remember her answer, I just remember it being relatively vague as she tried to shield my mind from the very things I just discussed... thankfully :)  But as it goes to show, a title can be a fun, insider-like piece of information for those who have experienced the movie, book, etc.  (In case you never saw The Silence of the Lambs.. a.) you should watch it b.) the title references a scene in the movie where the co-starring character comes to grips with her past, and her inability as a child to save lambs from being slaughtered on her uncle's farm).

So what does this have to do with the price of eggs? *

I've seen quite a few people see my blog, read the title, and tilt their heads sideways, similar to the way a dog looks at you quizzically when it sneezes or hoarks on a ham-bone.  Yes, the title of my blog is cryptic.  I'd like to tell you it is a witty, charming line coined decades ago by a past president, scholar or even saint.  It's not; it's half a reference to programming, half a reference to an inside joke with my wife.  I thought of it when I had 5 free minutes and wanted to start a blog, and for some reason I feel committed to it.  Combining the 2 phrases, I have boiled down the probability of a person understanding the title to 2 people in the world, one of whom (my wife) never reads this blog.  Good title choice, eh?

OK, on to the details.  For those of you who dedicated enough time to actually read this far down the page, for one.. I thank you!  For two, "Breakin References" is an allusion to the programming practice of adding a reference to a library for tools, features, programmable classes, etc.  These references depend on a separate file (such as a .dll file, which you may see all over the place on your PC).  When application projects are moved and these dependent references are not carried over properly, the result is a broken reference error.  The latter half of the phrase is a nod to "breaking hearts", as in Rocky, when Rocky tells Adrian "you look good, you're gonna break hearts".  My wife loved that phrase, and I told her she was going to break hearts once, and she instantly fell in love with me all over again.  And as we got older and had babies, we now say, whenever our daughter is dressed up in a nice dress for a party or whatever, that she is going to.. you guessed it.. break hearts.
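The same failure mode exists in any language with external dependencies. Here's a rough analogy (in Python rather than C#, purely for illustration; the module name is made up) of what happens when a project moves without its dependency coming along for the ride:

```python
import importlib

def load_dependency(module_name):
    """Load a library the project depends on, or report the 'broken reference'."""
    try:
        return importlib.import_module(module_name)
    except ModuleNotFoundError as err:
        # In .NET terms, this is the moment the broken reference bites:
        # the code is fine, but the file it points at isn't where it should be.
        return f"Broken reference: {err.name}"

print(load_dependency("math"))                  # a dependency that exists
print(load_dependency("some_missing_library"))  # hypothetical missing dependency
```

In C# the analogous failure usually shows up at build time as a missing-assembly error rather than at run time, but the cause is the same: the reference points at a file that didn't travel with the project.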

So there you have it, you can now say that you learned something new today.  It may be useless knowledge.. but the important thing is.. you learned.

Wednesday, March 20, 2013

The SQL Agent service is not running. This operation requires the SQL Agent service. (rsSchedulerNotResponding) - SharePoint 2010

My issue of an expired SSRS server has struck me once again!  In a previous post I discussed some of my perils in inadvertently using a trial mode SQL reporting server.  I thought I had bailed out of all of my issues, but I recently encountered another hurdle.  This time, when attempting to create a shared schedule for subscribed reports, I received the error message "The SQL Agent service is not running. This operation requires the SQL Agent service. (rsSchedulerNotResponding)".

A little bit of research uncovered yet another process\service that I needed to start, but forgot to.  In this case, the issue was solved by simply starting the "SQL Server Agent" service on my reporting server (not on the SharePoint Web Front Ends).  Once I did so, I went back to the shared schedule setup page, clicked OK, and all worked as expected!

Anyway, I just wanted to offer that tip for anyone else who runs into the issue.  It's not the most descriptive error out there, and it took me a few minutes of research in discussion boards before I found the solution.

Happy Reporting!

Thursday, March 14, 2013

Book Review - Moneyball

Moneyball: The Art of Winning an Unfair Game by Michael Lewis

My rating: 4 of 5 stars

Moneyball is definitely one of the better books I have read in the past year or so.  I found it easy to page through, the chapters are defined nicely, and it tells a very interesting story about a new methodology in baseball. The most eye-opening realization for me was that the baseball world was (and in some corners, still is) so subjective. Being of a younger generation, and an I.T. guy to boot, the concept of not using data for analysis and improvement is just odd to me. It makes me realize how many other aspects of our society still prefer the "eye test" over actual, objective data when it's available. It certainly makes you appreciate the technology we have today, and more importantly appreciate authors such as Bill James (a prevalent figure in the book) who painstakingly had to calculate by hand and write up the statistics we now have immediate access to, all formattable and sortable in a worksheet!

The only caveat to this book is that you really should be a baseball fan to read it. The added element for me was knowing the players Michael Lewis refers to. The names Jason Giambi, Jeremy Giambi and Kevin Youkilis are all prevalent in the middle chapters, and to truly appreciate their value to the A's, and to other clubs, it's good to have at least a basic understanding of these players. I also want to mention that if you are a fan of a team that likes to spend (as I am, as a Yankees fan), this is a HARD book to read, as it makes you appreciate those teams who build through the draft and scout and study and scrap and improve, knowing they don't have a piggy bank "just in case". That being said, if you're an I.T. fellow AND a sports\baseball fan.. both of which I am.. read this book.


Wednesday, March 13, 2013

A data source associated with the report has been disabled. (rsDataSourceDisabled)

The Backstory

Many months ago, I began the process of launching our SharePoint 2010 install for my organization.  In the process, we decided that we could\should migrate our existing SQL Server Reporting Services reports over to be accessible via SharePoint.  We felt the best way to do so was to fire up a new server, install SQL Server Reporting Services using "SharePoint Integrated Mode", and migrate all reports over from our old "Native" install to use SharePoint instead.  Apparently, along the way I decided to use a trial version of SQL Server 2008.  How do I know?  Because yesterday afternoon, the magic number of 180 days arrived, and around 2 PM my SSRS install turned into a proverbial pumpkin.

The best part about this little journey of mine was that it took me quite a bit of time to learn that this was the issue.  Every error message I found was vague and nondescript, and made no mention of SSRS being expired.  I eventually found traces of the SSRS service not running, and when I attempted to start it, it timed out.  From there, I found that the "current version" of SSRS I was running was "Enterprise Trial"... that's when my heart sank into my briefs.

I eventually got a "real" copy of SQL, installed it, and was on my way.  Upon reconnecting my SSRS server to SharePoint, everything was working except the pulling of my data (which.. to say.. is the most important part).  I started receiving the error message "A data source associated with the report has been disabled. (rsDataSourceDisabled)" - http://www.microsoft.com/products/ee/transform.aspx?evtsrc=microsoft.reportingservices.diagnostics.utilities.errorstrings.resources.strings&evtid=rsdatasourcedisabled&prodname=sql%20server%20reporting%20services&prodver=8.00&lcid=1033

The Solution

The issue, I eventually found out, was that when I upgraded my SQL instance and reconnected SharePoint to SSRS, the data sources stored within SharePoint essentially became "invalid".  I had to visit the shared data sources in SharePoint (these will vary in your SharePoint install, depending on where you store your reports), choose to edit them, re-enter the authentication username and password, and check the box "Enable this data source".  Upon doing so, my reports were working again, baller!

Monday, March 4, 2013

The perils of not testing properly, and making the largest web page ever in the process

While rooting around some old documentation at work, I came across a screenshot of a website I had worked on as part of a very large project.

What is this?  I'll zoom in on the highlighted part of the image, as that's the important part:

In case you aren't familiar with debugging web applications, this is the size of the web page being generated for a user request.  It indicates that the page size is over 8 MB.  To put that in perspective, if a typical web page were a slice of pizza, this is like... 4 trays of pizza.

The back story

Looking at this, I vividly remember where I was when I preserved this screenshot.  We were at the tail end of a massive 18-month project to convert this particular company's system and sales processes, involving hundreds of employees and contractors across the globe.  Each component of the system had large sub-components, all of which had been tweaked and re-tweaked countless times as the project owners and champions continually changed their vision and requirements, while also refusing to budge on the project deadline.  After many cups of coffee and few hours of sleep in the final weeks, many of the developers and system engineers involved found a way to stay on target, and we were on the eve of the launch, which had cost the parent company tens of millions to complete.

On this particular night, a majority of those involved in the launch of the project were on a late conference call also involving various V.P.s and executives who made more money while sitting on that call than I make in a month.  Those higher-ups all wanted a final public blessing from everyone that the launch event would occur, that we were prepared to deal with any issues, and that we would be live with all new systems at the end of the weekend.  Of course, no one had the guts to actually "raise their hands" and say we weren't ready.  And to be honest, nobody was comfortable launching that weekend; we all had loads of defects logged, and for the last month before launch, we all focused solely on issues that were considered "mission critical" (which is another tale entirely, as I had a previous freak-out when someone said the misspelling of the word "the" was "revenue impacting" and therefore a mission critical defect.. but I digress).

So What Happened?

So, would you believe that, with all the time spent on developing and testing these new systems, we were never able to prepare and test with real, live data until the actual go-live process was about to unfold?  And could you believe that nothing could go wrong with this model?  Of course something did.. actually, there were many hiccups during the launch process.  From a web perspective, it was disheartening that the first of these occurred within MINUTES of the launch process starting.  It was also scary, because with normal web requests this large, it could have crippled our web servers within hours of launching our new systems, making the team I was on look awful.

The page in the screenshot above is the output of a course search for that particular company, produced simply by requesting all courses within a 20 mile radius of a zip code.  The result was a HUGE number of courses!  And it wasn't just the course name, but a variety of data that was sent down with each course offering.  In the end, hundreds of courses were sent down from the search.

So you're asking yourself.. why didn't you test this?  We did, for months even.  What we DIDN'T test with was the actual real-to-life dataset that would launch during go-live.  We were testing with a tenth of the real data.  For business reasons, we weren't able to get the actual real-to-life dataset until go-live, because it would make too much of an impact for users to start entering the data until it was time to go live.
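Here's a minimal sketch of why the smaller test set hid the problem. The field names and sizes are entirely made up, but the point stands: if every matching record travels with the response, payload size grows linearly with the dataset, so a one-tenth dataset shows roughly one-tenth the page size:

```python
import json

def build_course_record(i):
    # Hypothetical course record; the real results carried far more
    # than the course name with every offering.
    return {
        "course_id": i,
        "name": f"Course {i}",
        "description": "x" * 2000,    # long description, schedule, etc.
        "instructor_bio": "y" * 1500,
    }

def page_size_bytes(num_courses):
    # Size of the payload if every matching course is sent down at once.
    return len(json.dumps([build_course_record(i) for i in range(num_courses)]))

test_size = page_size_bytes(40)    # roughly a one-tenth test dataset
live_size = page_size_bytes(400)   # the full go-live dataset

print(f"test: {test_size / 1024:.0f} KB, live: {live_size / 1024:.0f} KB")
```

With numbers like these, a page that looks perfectly reasonable against test data becomes an order of magnitude heavier the moment the real data arrives.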

And so here we were, on the eve of launch, and while our V.P. of IT was telling very important men and women with several cars, apartments, computers and probably a yacht or two among the group that all would be OK when all systems were live, our team was discovering a huge, huge issue.

So What Did We Learn?

A few things, actually.  The most important was that we can change our shorts and think of workarounds at the same time.  In the end the fix was actually simple, enough so that we were able to implement it within an hour or so and get it into Quality Assurance testing that evening - it basically involved reducing the number of courses returned to the screen at a time, making the new page size much, much smaller.
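The workaround can be sketched like this (a simplified illustration of the idea, not the actual production code; the course names and page size are invented):

```python
def paginate(courses, page, page_size=25):
    # Return only one page of results instead of the full list,
    # bounding the response size no matter how many courses match.
    start = page * page_size
    return courses[start:start + page_size]

all_courses = [f"Course {i}" for i in range(400)]  # hypothetical full result set
first_page = paginate(all_courses, page=0)
print(len(first_page), first_page[0], first_page[-1])  # → 25 Course 0 Course 24
```

The full result set still exists server-side, but each request now carries a fixed-size slice of it, so the page weight stays constant even when the matching data grows tenfold.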

The real lesson, however, and the reason for this post, is that projects of all shapes and sizes need to be tested appropriately.  When developing a new system, or on top of new hardware, or even performing upgrades, it's so easy to focus on what's new or what's changing.  Those aspects are important of course, but they're not the entire picture; it's not a real test unless you use real stuff!  For my story above, I can't count how many times we enrolled in fake courses called "blah blah" taking place in zip code "12345".  What we really needed was a real data dump, and trust me, we had asked for that, for months!  We didn't get it, and by the grace of God we were able to work around our particular issues.  If we hadn't, or if the website issue was large enough that we couldn't have it ready in time, it was our team on the hook.. not all the people who failed to provide the accurate data we wanted to test with.