picnicerror.net – BI, data, MSSQL, AWS and other such stuff

SQS vs SNS for Lambda Dead Letter Queues
Fri, 02 Mar 2018

Serverless computing and event-driven functions are what it’s all about at the moment.  But what happens when the event trigger fires, and your process then encounters an error?  How do you recover, given the event has since passed and may never happen again?  This is a common question in AWS when working with their serverless, event-driven Lambda Functions.

Fortunately, AWS lets you define Dead Letter Queues for this very scenario.  This option allows you to designate either an SQS queue or an SNS topic as a DLQ, meaning that when your Lambda function fails, it will push the incoming event message (and some additional context) onto the specified resource.  If it’s SNS, you can send out alerts or trigger other services (maybe even a retry of the same function, although watch out for infinite loops), or any combination of the above, given its fan-out nature.  If it’s SQS, you can persist the message and process it with another service.
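Just to make that concrete, here’s a rough boto3 sketch of attaching a DLQ to an existing function.  The function name and ARN below are placeholders, and the same call accepts an SNS topic ARN instead:

```python
import boto3

lambda_client = boto3.client("lambda")

# Point the function's dead letter queue at an SQS queue (an SNS topic
# ARN works equally well here). Names and account ID are placeholders.
lambda_client.update_function_configuration(
    FunctionName="my-event-handler",
    DeadLetterConfig={
        "TargetArn": "arn:aws:sqs:eu-west-1:123456789012:my-function-dlq"
    },
)
```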

So let’s look at both options in a little more detail.

SQS Dead Letter Queue

Using SQS as a Dead Letter Queue (DLQ) ensures that you have a durable store for failed events that can be monitored (allowing the necessary services/individuals to be alerted) and picked up for resolution at your convenience.  This allows you to process failures in bulk, wait a defined period before re-triggering the original event, or take some other steps towards resolution.

SQS gives you a durable dead letter queue that can be monitored and polled to collect failed events for re-processing or special attention.

The fact that you don’t reprocess the event straight away gives you a little more flexibility around when and how you deal with Lambda failures.
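As a minimal sketch (the queue name is a placeholder, and real handling logic would replace the prints), polling the DLQ with boto3 might look like this.  Note that Lambda attaches failure context, such as the request ID and error message, as message attributes:

```python
import boto3

sqs = boto3.client("sqs")
queue_url = sqs.get_queue_url(QueueName="my-function-dlq")["QueueUrl"]

# Long-poll the dead letter queue; each message body is the original
# event that the Lambda function failed to process.
response = sqs.receive_message(
    QueueUrl=queue_url,
    MaxNumberOfMessages=10,
    WaitTimeSeconds=20,
    MessageAttributeNames=["All"],  # includes Lambda's failure context
)

for message in response.get("Messages", []):
    print("Failed event:", message["Body"])
    print("Context:", message.get("MessageAttributes", {}))
    # Once the failure has been dealt with, delete the message so the
    # next poll doesn't pick it up again.
    sqs.delete_message(
        QueueUrl=queue_url,
        ReceiptHandle=message["ReceiptHandle"],
    )
```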

Pros

  • Durability: process when you’re ready to deal with the issue, maybe in bulk.
  • Can keep messages for up to 14 days
  • Near-guaranteed delivery

Cons

  • Latency: not event-driven so must be polled.
  • Single-subscriber: Messages are deleted after being consumed by a subscriber, so this assumes a single process will be taking action on failed messages.


SNS Dead Letter Queue

SNS, or Simple Notification Service, is a key part of AWS’s event-driven offering, letting you process events almost instantaneously and fan out to multiple subscribers.  It’s a great way to integrate applications in a microservices architecture.  You can also use an SNS Topic as a Dead Letter Queue (DLQ).  This has the benefit of allowing you to take action on a failure instantly, whether that be attempting to re-process the message, alerting an individual/process, storing the event message somewhere for follow-up, or any combination/all of the above.

SNS provides an event-driven Dead Letter Queue, enabling you to take immediate action to retry, alert, and/or store the incoming event-message.

The key to the SNS approach is its flexibility in sending messages to multiple subscribers.  It allows you to take some action immediately, while also passing the message to other, more suitable systems where it can be picked up and processed.
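A minimal sketch of a Lambda handler subscribed to the DLQ topic (the follow-up actions are placeholders); the failed event arrives wrapped in the Sns.Message field of each record:

```python
import json

def handle_dlq_event(event, context):
    # SNS invokes the function with one or more records, each wrapping
    # the original failed event in the Sns.Message field.
    for record in event["Records"]:
        failed_event = json.loads(record["Sns"]["Message"])
        # Placeholder actions: alert someone, persist the payload, or
        # re-invoke the original function with the recovered event.
        print("Handling failed event:", failed_event)
```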

Pros

  • Event-driven: An SNS DLQ will trigger actions instantly upon receiving a message.
  • Fan-out: Configuring multiple subscribers allows multiple actions to be taken by different subscribers at the same time.

Cons

  • Non-durable: SNS doesn’t keep messages for more than an hour.

Best of Both Worlds

A pattern that works rather well, and offers the best of both worlds, is to combine SNS and SQS.  By defining an SNS Topic as the DLQ, and attaching an SQS queue as a subscriber to that topic, you get your durable store in the SQS queue while still being able to take instant action.  The only caveat is that if you re-attempt to process the message and this time it succeeds, you need some way to tell SQS so that the message can be removed from the queue.

Not perfect by any stretch, but it gives a little of the benefit of both.
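Wiring the two together is just a single subscription.  A rough boto3 sketch with placeholder ARNs (the queue also needs an access policy that allows the topic to send messages to it):

```python
import boto3

sns = boto3.client("sns")

# Placeholder ARNs: the SNS topic acting as the Lambda DLQ, and the
# SQS queue providing the durable store behind it.
sns.subscribe(
    TopicArn="arn:aws:sns:eu-west-1:123456789012:my-function-dlq-topic",
    Protocol="sqs",
    Endpoint="arn:aws:sqs:eu-west-1:123456789012:my-function-dlq-store",
)
```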

Summary

There are a huge number of different patterns (and anti-patterns) out there for implementing SQS and SNS, as well as Lambda and event-driven architectures in general.  The two above are just basic representations that work well in certain scenarios.  I’d be really interested to hear from other people who have worked with serverless/event-driven systems on AWS and what your opinions are, as well as any patterns you’ve found to be a good way of managing DLQs.

Please leave your comments and thoughts below!


Is AWS about to enable Redshift Spectrum with Enhanced VPC Routing?
Thu, 18 Jan 2018

AWS is knocking it out of the park at the moment, with loads of new services and features coming out every week.  Indeed, it can be hard to keep up with the pace of change.  But while working on one of our Redshift clusters today, we spotted a potential scoop that would remove a key blocker for one extremely useful service, Redshift Spectrum.

Up until now it’s only been possible to use Spectrum if you don’t have Enhanced VPC Routing enabled on your Redshift cluster.  There are so many benefits to using Enhanced VPC Routing (reduced data transfer cost, control, security) that it’s hard to see why anyone wouldn’t be using it, especially if you move data between Redshift and S3 a lot.

But we spotted a new parameter being applied to one of our clusters when we made some maintenance changes to a parameter group.  There’s now a parameter named spectrum_enable_enhanced_vpc_routing showing, which hints that Amazon may be about to remove this crucial limitation.


What is Redshift Spectrum?

Redshift Spectrum is a seriously cool name for what is essentially fluid extra horsepower for your Redshift cluster.  One of the things commonly cited as a drawback for Redshift is the fact that storage is coupled with compute: there’s no way to scale up to more computing power without also scaling storage (and paying for it).  Enter Spectrum.

Redshift Spectrum is an extension to Redshift that allows AWS users to use on-demand Redshift capability to instantly scale compute power in order to query data that is held in S3.  This works by defining external tables in Redshift.  These external tables are essentially metadata telling Redshift that the files in a specific S3 location are structured in a particular way, so that when a user issues a query against the external table, the Redshift query optimiser knows what the data is, and what it looks like.

When you query this external table, Redshift calculates the estimated data volumes, and computing power needed, and allocates some compute resources from a central pool in order to service your query.  This all happens transparently, and ensures that you are temporarily allocated the necessary compute power to process your query in a reasonable timeframe.
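To make the external table idea concrete, here’s a rough sketch of defining and querying one from Python.  The cluster details, IAM role, bucket, and table layout are all placeholders; the DDL follows Amazon’s documented Spectrum syntax:

```python
import psycopg2

# Placeholder connection details for an existing Redshift cluster.
conn = psycopg2.connect(
    host="my-cluster.abc123xyz.eu-west-1.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="admin", password="secret",
)
conn.autocommit = True  # external DDL can't run inside a transaction
cur = conn.cursor()

# Register an external schema backed by the data catalogue, via a
# (placeholder) IAM role that can read the S3 data.
cur.execute("""
    CREATE EXTERNAL SCHEMA IF NOT EXISTS spectrum
    FROM DATA CATALOG DATABASE 'spectrumdb'
    IAM_ROLE 'arn:aws:iam::123456789012:role/MySpectrumRole'
    CREATE EXTERNAL DATABASE IF NOT EXISTS
""")

# Metadata only: tell Redshift how the files in S3 are structured.
cur.execute("""
    CREATE EXTERNAL TABLE spectrum.page_views (
        user_id BIGINT,
        url     VARCHAR(2048),
        viewed  TIMESTAMP
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    STORED AS TEXTFILE
    LOCATION 's3://my-bucket/page-views/'
""")

# This query is served by Spectrum's transparently allocated compute
# pool rather than (only) the cluster's own nodes.
cur.execute("SELECT COUNT(*) FROM spectrum.page_views")
print(cur.fetchone())
```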

Crucially, this answers the compute vs storage complaint and gives Redshift a similar capability to Google’s BigQuery, which had previously been missing.

I’ll delve into Spectrum in more detail in another post, but for now let’s get back to the matter at hand.  In the meantime, why not check out Amazon’s docs on Redshift Spectrum?


What is Enhanced VPC Routing?

In AWS you can configure VPCs (Virtual Private Clouds), which allow you to segregate and group resources and control security, data transfer, and all sorts of other things for all manner of reasons.  Crucially though, some centralised AWS services, most importantly S3 (Simple Storage Service), which is the backbone of AWS, live outside your VPCs.  Amazon don’t charge you to put data into AWS (why would they?) but they do charge you to take data out, or to move it around between regions and VPCs.  S3 living outside your VPC also means that traffic between the VPC and S3 has to go over the big bad Internet.

So this becomes important when you have data moving between “VPC-less” (at least in basic terms) services such as S3 and the resources you’ve configured within a VPC, for example Redshift.  Fortunately, AWS offers Enhanced VPC Routing, which allows you to route traffic between S3 and Redshift through your VPC, meaning you can control all kinds of aspects of this data movement such as DNS, security groups, ACLs, traffic monitoring and loads more.  The advantages are obvious.
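For reference, here’s a sketch of switching it on for an existing cluster with boto3.  The cluster identifier is a placeholder, and note that applying the change restarts the cluster:

```python
import boto3

redshift = boto3.client("redshift")

# Enable Enhanced VPC Routing on an existing (placeholder) cluster.
# Applying this change briefly restarts the cluster.
redshift.modify_cluster(
    ClusterIdentifier="my-cluster",
    EnhancedVpcRouting=True,
)
```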

Again, I may touch on this in another post, so I’ll leave it here for now.  In the meantime, see Amazon’s docs on Enhanced VPC Routing and Redshift.


Redshift Spectrum and Enhanced VPC Routing

Tucked away in the Spectrum small print is a line that states “Your cluster can’t have Enhanced VPC Routing enabled.”  This is a major blocker for anyone wanting to use Spectrum with an in-VPC Redshift cluster, as it would mean either creating a new cluster or turning off Enhanced VPC Routing.

Fortunately, the newly appeared spectrum_enable_enhanced_vpc_routing parameter suggests that this may be about to change.  I’ve not seen anything from Amazon yet to confirm this, but watch this space!


The parameter “spectrum_enable_enhanced_vpc_routing” has suddenly appeared on the Redshift console, hinting that Spectrum may be about to remove a major restriction.
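If you want to check your own clusters, here’s a quick boto3 sketch (the parameter group name is a placeholder) that lists a parameter group and picks out anything Spectrum-related:

```python
import boto3

redshift = boto3.client("redshift")

# List the parameters in a (placeholder) parameter group and pick out
# anything Spectrum-related, including the newly spotted flag.
params = redshift.describe_cluster_parameters(
    ParameterGroupName="my-parameter-group"
)["Parameters"]

for param in params:
    if "spectrum" in param["ParameterName"]:
        print(param["ParameterName"], "=", param.get("ParameterValue"))
```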

Let me know in the comments below if you’ve seen any more on the topic, or any official comms from AWS.

Redshift connectivity officially announced for Power BI Service
Fri, 10 Mar 2017

Last year, Microsoft added a preview connector enabling Power BI to query Amazon Redshift.  This wasn’t publicised as an “official” data source, and it took some steps to even see the connector in Power BI Desktop.  Crucially, you could only use this connector in Power BI Desktop, not when workbooks were deployed to the cloud.  Yesterday, Microsoft announced the connector is now available within the Power BI Service, which means that workbooks containing Redshift data connections can now be deployed to the cloud.  I’ve been working a lot with Redshift over the past year or so, and Power BI’s still my go-to data-viz solution, so I’m delighted to see this announcement, as it means that Redshift-based workbooks can now be shared with others via powerbi.com.

You can read the details of the announcement of Redshift for the Power BI Service over on the Power BI blog; I’m not going to replicate them here.

Working with Redshift data in Power BI

As with most database-type data sources, Power BI offers two query modes: Import and DirectQuery.  Import mode allows you to select a number of tables and views from the data source, and then loads all the data from these into Power BI.  That’s fine for a lot of data sources, but when you’re dealing with potentially billions of rows of data, as you normally would be in Redshift (or other big data solutions like Google BigQuery, Spark, Snowflake, etc.), it isn’t really an option.  You’re paying for the processing power these solutions offer, so use it.  DirectQuery mode pushes query execution down onto the database, allowing something like Redshift to use the power of the cluster to execute the query and return the results to the client, in this case Power BI.  This is a very common model among client tools that support big data repositories; Tableau, Qlik, Alteryx etc. all support a similar practice under various names.  These queries are issued in real time, as a user filters and interacts with the visualisation.  There are some limitations to this approach, as outlined here.

The configuration on powerbi.com is still a little involved, and there aren’t direct connectors set up as of yet, but it’s great to see Microsoft weaving Redshift support deeper into Power BI.  Watch this space!

If you’ve used the Redshift connector for Power BI (or any of the other experimental connectors like Impala or Snowflake), let me know in the comments below how your experience has been and what your thoughts are.


Amazon Quicksight now in General Availability!
Wed, 16 Nov 2016

Late last night, Amazon announced that their proprietary AWS data visualisation tool, Quicksight, is now generally available in the US and Ireland.  Quicksight aims to be a Power BI-esque drag-and-drop visualisation tool that allows you to access your data from AWS (and other sources) in seconds, regardless of scale.  I’ve had a very quick go this morning, and visualised some data from a modest 1TB Redshift cluster after just a few minutes.  The biggest challenge was finding out the correct IP range for Quicksight to enable access to my VPC (thank you, Server Fault).
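For anyone fighting the same battle, this is roughly the security group rule involved, sketched with boto3.  The group ID and CIDR below are placeholders: look up the published Quicksight IP range for your region before using it:

```python
import boto3

ec2 = boto3.client("ec2")

# Allow Quicksight's published IP range for your region to reach the
# Redshift port. Both values are placeholders: substitute your
# cluster's security group ID and the documented Quicksight CIDR.
ec2.authorize_security_group_ingress(
    GroupId="sg-0123456789abcdef0",
    IpProtocol="tcp",
    FromPort=5439,
    ToPort=5439,
    CidrIp="203.0.113.0/27",  # example range only, not the real one
)
```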

More to follow… In the meantime, try Quicksight for yourself here: https://quicksight.aws.amazon.com

5 Observations from Microsoft Build 2016
Wed, 20 Apr 2016

Microsoft’s Build conference for 2016 took place a couple of weeks ago and, true to form, there were a number of killer announcements and reveals across services, tools, and frameworks, many of which are available today.  Not one to ever post something when it’s actually relevant, here are a few of the things that jumped out at me from the event.

Natural Language Processing is at the core of Microsoft’s future

Microsoft hasn’t made any secret of their work in the field of Natural Language Processing.  They released Q&A as one of the key features of Power BI, enabling users to query their data and generate visualisations using near-natural language.  Then Cortana came along, using the same NLP algorithms and knowledge base to enable Windows Phone users to command their mobile with their voice.  Then, as the capabilities advanced, Cortana was introduced to Windows 10 and became a key part of the latest Microsoft operating system, all built on the same platform and all the experience garnered over the last few years.

“Human language is the new UI” – Microsoft CEO Satya Nadella

During his keynote address, MS CEO Satya Nadella gave a key insight into the company’s direction: “Human language is the new UI,” he said, and voice-controlled “bots are the new apps.”  Microsoft’s vision is for users to interact with bots via natural language; the bots will then interpret the user’s commands and relay them to the computer.  “Clunky” web forms and cluttered interfaces will be replaced by a new, simpler way to interact with computers.

It’s a grand vision, and while we’re definitely a long way away from the talking computers of Star Trek, Microsoft has made some excellent strides in this direction, not least with the underlying platform behind Cortana, which has now been released to developers as the Language Understanding Intelligent Service (LUIS).  This is currently in beta (and free to use), and is definitely worth checking out if you haven’t already: https://www.luis.ai/.


Microsoft is responsible for the apocalypse

Speaking of bots, I’m sure everyone has already read about the exploits of Tay.  Microsoft’s Twitter-bot, built as a demo and test for their new Bot Framework, quickly became famous as “she” learned from other Twitter users, and rapidly degenerated from an innocent, naive teenage girl into a racist, abusive and downright maniacal representation of all that’s wrong with the human race.  Thank God they didn’t give her access to nuclear launch codes or it would be Skynet all over again!

Microsoft’s Twitter-bot Tay quickly lowered herself to the level of the worst Twitter trolls out there. Not the best advert for the company’s AI capabilities.

In saying that, Microsoft’s new Bot Framework is a very impressive product, and provides the brain to enable developers to create their own bots to easily integrate with lots of different services, such as Skype, email, Slack, and many more.  Taking the chilling vision of Tay’s future out of the equation, the availability of such a framework is an excellent idea, given the number of extremely poor “chat” programs out there.  And integration with other channels means developers can easily add smart chat functionality to anything they build, furthering the case for natural-language human-computer interaction.


Bash on Windows

Okay, so this one’s been making a lot of headlines.  There’s a fair bit of scepticism about how good it will be, but having Bash running on Windows 10 brings some really excellent developer utilities to Windows, which previously relied on .NET-based copies of core Linux utilities.  Couple this with the recent announcement about SQL Server being launched for Linux, and it’s clear that Microsoft these days is about getting people access to the best things, regardless of their platform of choice.

Power BI leads natively on iOS, SQL Server is coming to Linux, and now Bash on Windows.  Modern-day Microsoft is a fair way away from the MS of old.


Cross platform is really here.

The Universal Windows Platform (UWP) has had a lot of coverage lately, some good, some bad.  However, Microsoft is taking a really admirable approach in trying to save developers time (who wants to develop an app separately for every different form factor?) and improve continuity of user experience across devices.  The UWP is a great idea, but the key issue has yet again been the availability of apps on the platform.

So, it’s welcome news that at Build this year, MS announced that they’re providing a mechanism for developers to port their Win32 and .NET-based apps to the UWP.  Many intrepid souls have already shown this working with old-school PC games and other custom apps, and it’s great news that opens up the world’s largest collection of apps (or applications, if you were around before the millennium) to the new portable Windows platform.

And in other great cross-platform news, Continuum, Microsoft’s “convert your Windows Phone to a PC-lite” feature, is adding support for the Xbox One controller, meaning that when you’re away from home you just need to pack your controller and your display adaptor to turn your phone into a portable console that can hook up to a hotel TV.  Sure, you’ll be limited to phone-based games, but stuff like Halo: Spartan Assault should work really well, assuming they add controller support.


Project Oxford goes live

And finally, one of the most impressive projects (in my opinion) out of Microsoft Research in recent years, Project Oxford, has graduated to a full release.  The collection of machine learning services offers easy-to-use APIs that provide image recognition, image-based emotion detection, and can even identify a breed of dog via their web-servicified (not a word) machine learning models.  There are a huge number of amazing possibilities with these services; just check out the video below to see what one MS staffer created for his smart glasses:

Project Oxford has been released commercially under the name Cognitive Services, and is available here: https://www.microsoft.com/cognitive-services/en-us/apis.


Conclusion

So, in summary, a really exciting and interesting Build conference this year, with a strong focus on creating intelligent software that can really learn from user behaviour and provide a better, easier user experience across all devices.  There are some really grand ideas being thrown about, and I’ll be keeping a close eye on things to see how they progress over the next year.  I really like where Microsoft’s vision is heading; the big question is whether they can deliver on all of the promise, or whether they’ll fall short.



Possible downtime – Hosting Migration
Thu, 10 Sep 2015

Yet again I’ve let things slide and haven’t posted in a while.  This one’s nothing exciting: I’m currently migrating to a new hosting provider, so any weirdness can be attributed to this.

Server migration in progress

I’m hoping to be back soon with some more up to date posts!

Impressions of Microsoft’s new-look Power BI
Mon, 25 May 2015

A couple of months ago, Microsoft’s new-look Power BI Preview rolled out globally.  Ditching the Office 365/SharePoint Online requirement, the new Power BI is a streamlined, simplified version of the product that attempts to lose some of the bloat and give users a focussed, easy-to-use, self-service BI platform.

So has it worked?

What was Power BI like before?

The original version of Power BI built upon functionality that had been introduced through Excel, SQL Server, and SharePoint over a few years.  Starting with the Power Pivot Excel 2010 add-on, Microsoft introduced Power View to SQL Server 2012/SharePoint 2010, giving the first self-service (Microsoft) BI capability to enterprise (I’m choosing to ignore Report Builder/Performance Point here and focus on the direct roadmap to Power BI).

Shortly afterwards, Microsoft introduced the Power Query add-on for Excel 2010, which brought powerful ETL tools into the stable, allowing power users to gather their own data and mash it together with their in-house systems.

With the launch of Excel 2013, Microsoft integrated Power Pivot, Power Query, and Power View (as well as the new Power Map) into one place, and started connecting the dots between the components, allowing users to shape data in Power Query, model in Power Pivot, and visualise in Power View, all within Excel.  Again, the sharing aspect of this was handled by SharePoint, which allows users to upload their workbooks and share them with their organisation.


SharePoint Online adds a lot of additional content before you get to a Power BI site

As a natural extension of this, Microsoft decided to offer a cloud-based version of the same functionality.  Removing the costly SQL and SharePoint licences (not to mention hardware costs), they offered a “per-user” cloud option using Office 365 and SharePoint Online.  This brought a lot of the same functionality available to enterprise within the reach of small teams/businesses and solo users.  It also allowed the addition of killer new features, such as Q&A, a Natural Language Query function that produces visualisations on the fly in response to users’ semi-plain English queries (no SQL/DAX/MDX syntax).  Also, the ability to configure Data Management Gateways to on-premise data (and create an OData feed) was a huge addition.

Unfortunately, the drawback with Power BI on Office 365 was the same as with a lot of Microsoft products.  Throwing multiple components together and using existing services like O365/SP Online meant that there was a lot of confusion with end users on how to access all the features.  Some of the Power BI components and/or features were only compatible with certain versions of Excel, and the reliance on O365/SharePoint Online meant you had to dig through a lot of extra content and settings just to get your Power BI site up and running.

Power BI for Office 365 is tucked away in the depths of SharePoint

Power BI sites on Office 365 are nice, but have limited functionality online

SharePoint’s a great bit of kit when used correctly, but when it comes to self-service BI, the original Power BI was miles behind the likes of Tableau and Qlik in terms of simplicity and ease of access.

Goodbye Eric Stoltz, hello Michael J Fox – Enter the new look Power BI.


What’s good about the new Power BI?

Microsoft has been very good at listening to their customers recently, asking for user feedback on everything from SQL Server to Xbox.  They agreed that Power BI needed to be as simple as possible for new users, so they set to work removing barriers and creating a streamlined, dedicated service.


Ditched SP Online and Office 365

Removing the SharePoint Online/Office 365 requirement in favour of a dedicated site and pricing structure is a massive step.  By stripping out all the bloat, MSFT have created something instantly accessible and cheap.


Standalone Power BI Designer

Integrating the Power BI tools into Excel was a very good idea, and fortunately MSFT are continuing to support this.  However, they have also recognised that not everybody has the latest version of Excel.  To this end, they’ve launched a free, standalone Power BI Designer application that offers much of the same functionality.  In fact, the Power BI Designer seems to be basically Power Query, Power Pivot, and Power View, all wrapped into a shiny new UI.

The new standalone Power BI Designer application

The designer creates a proprietary file format (.pbix), which can be uploaded directly to the Power BI website.


HTML5

Silverlight was a massive, massive drawback to Power View, especially considering Google’s withdrawal of support for the NPAPI protocol and removal of Silverlight support from Chrome in September 2015.  Silverlight was also a massive barrier to getting Power BI running on mobile and Internet of Things (IoT) devices.

To their credit, MSFT had experimented with HTML5 in Power BI on Office 365, but it was never anything more than a beta.

With the new Power BI, MSFT have fully embraced HTML5, which opens up huge possibilities for accessing data regardless of location.  Which leads me nicely on to…


Mobile app

In a rather bold move, MSFT has launched a Power BI mobile app, leading on iOS.  Eschewing their own Windows Phone platform, MSFT listened to their customers, who said that they would be most likely to have Apple devices.  The mobile app is available now, and allows users to view all their up-to-date dashboards and reports, and to pin any visualisation from across their content to the homepage as a favourite.

Power BI iOS app

There are plans to support more platforms, but by launching with iOS first, MSFT has really let their head rule their heart and made a smart decision.


User Measure/Field creation

One of the drawbacks with the previous iteration of Power View (in Excel/SharePoint) is that you can’t directly add new measures.  To define new calculated measures or columns, you have to go back to Power Pivot and define them there.

The new Power BI Designer allows users to define new calculated measures and columns directly on the Power View interface, meaning that users can quickly create new fields on the fly using DAX, and play with data types and formatting.  This adds a key function that Tableau and Qlik have offered for some time.


Customisable dashboards

The addition of separation between Reports and Dashboards is a very nice touch.  Previously, a user would build a Power View using the data in their model, and everyone would have to stick with the same Power View with all the visualisations selected by the creator, or edit it themselves.

With the new Power BI, the creator builds a Report containing the data and a number of visualisations.  Subsequent users can then pick and choose which of these visualisations they wish to use by adding them to Dashboards.  Dashboards can mix and match data from multiple sources and Reports, meaning you can combine data from Google Analytics with streamed data from your web logs if you wish!  Oh, and if your data source is stored in the cloud (uploaded Power BI/Excel file, Azure etc), then you can explore the data in Natural Language via Q&A and pin the results to your dashboard.

You can then share Dashboards with other users straight from the UI.


More visualisations

This is a very quick point, but MSFT are regularly adding more and more visualisation types to Power BI, meaning it’s constantly evolving and expanding.


Active support and update programme

Finally, one of the most exciting things about the new Power BI is the rate at which it is being updated.  There’s a vocal support forum where MSFT are actively engaging with users, asking them to share what they’d like to see next and vote on previous submissions, and updating everyone on planned developments.

They’re also making rolling monthly updates to both the main Power BI website, and the Power BI Designer, constantly adding new data sources and functionality.


What’s not so good?

Inevitably with a new, immature product, there are some glaring omissions.  The good news is that MSFT is aware of a lot of the drawbacks and appears to be working hard to add new functionality.


Limited Data Refresh

The biggest gripe I have currently is the limited data sources that support refresh.  At the time of posting, you can only automatically refresh your data if it resides in an uploaded Excel file that uses Power Query to load cloud-based data, or an on-premise SSAS Tabular model with a gateway configured.  If they can add in other sources for direct refresh from on-premise systems like SQL Server, SSAS Multi-dimensional, and other DB systems, it’ll be much more useful for serious work.


Limited Q&A Support

Perhaps unsurprising this one, but the Q&A Natural Language Query functionality can only run on cloud-based data sources, or data that resides in static uploaded Power BI Designer (.pbix) files.

Q&A on Power BI

Support for Q&A is limited to cloud data sources

It would be good to see Q&A support for live data sources like on-premise SSAS Tabular.


No Export Function

No matter how nice a web application you build, the inevitable question always comes up: “Can I export this report to Excel?”  In fact, there’s currently no export functionality in Power BI at all.

Given how great SSRS is at exporting reports, it would be nice to see even a subset of that in Power BI, say export to Excel, PDF, and Image.  Clients always seem to want to email their reports around.


Interactive Tiles on Dashboards

The customisable dashboards are really nice.  However, the tiles are static.  By that, I mean that they don’t have nice tooltips when you hover over a data point, and you can’t click to filter the visualisation (okay, that one’s probably fair, given that a dashboard can cover multiple unrelated reports).

I really like the cross-visualisation filtering in Power View though, so I hope MSFT find a way to get this working.


Still in Preview

The main drawback with Power BI is simply that it’s an immature product that’s still subject to massive change.  As such, I wouldn’t recommend betting the farm on it instead of something like Tableau, Qlik, or more traditional enterprise BI tools.  However, it’s made an incredibly strong start, and if MSFT continue to listen to their customers and update Power BI as regularly and as quickly as they have been, they could be on to a real winner.

Power BI Data Sources

Power BI supports an ever-increasing array of data sources.


Summary

The new incarnation of Power BI is looking extremely promising.  MSFT have learned from their missteps with Power BI for Office 365 and have launched a Preview that looks like a serious contender in the cloud/self-service BI space.  The fact that they’re updating constantly and actively engaging with users on their support forum suggests that they’re committed to delivering a product that people really want.  I’d like to see some of this approach make it to on-premise BI too, stripping out the bloat and making things easier for clients to get involved.

Early signs are very encouraging though, and I’ll be keeping tabs on Power BI over the coming months to see how it progresses.

Let me know in the comments below if you’ve tried Power BI, and what your thoughts were.  And get on over to the Support forums at https://support.powerbi.com/forums/265200-power-bi and tell MSFT what you want to see!

Merry Christmas and Happy New Year from picnicerror.net!
Wed, 24 Dec 2014

2014 has been a very quiet year on picnicerror.net.  Since moving job in April, things have been rather hectic, which has meant I’ve not had much time to post, despite having lots of good ideas for content.  I’m hoping to pick things back up over the holidays and get some content ready for the new year.  In the meantime though:

Merry Christmas ya filthy animal!

Merry Christmas to all, and here’s to a fantastic 2015!

Debugging Errors in SSIS Data Quality Services Cleansing Component
Mon, 25 Aug 2014

As part of Microsoft’s push to include business users in the Business Intelligence space, the addition of Data Quality Services to SQL Server’s feature set opened up the ETL process to the people who, arguably, know the data best.  Integration with SSIS was a great move, meaning that this user control extends to automated processes, further closing the gap between data and the business.

However, like most new components, Data Quality Services has some teething problems, and these can be quite hard to find when debugging your SSIS packages.  Here’s a quick tip that should help solve some of those tricky to find DQS issues.

What is Data Quality Services?

Data Quality Services (DQS) was added in SQL Server 2012 as a feature to allow knowledge base curation for data cleansing and matching rules.  Available in the Business Intelligence and Enterprise SKUs, Data Quality Services works (optionally) in tandem with Master Data Services to allow business users to control and shape how data should be handled and validated for business use.

As well as a standalone Data Quality Client application, Data Quality Services also integrates with SQL Server Integration Services (SSIS) to enable the use of your user-curated cleansing and matching rules during your automated Extract, Transform, Load (ETL) processes.  In effect, this gives the very people who understand the data the ability to shape the Transform stage of ETL, something that would traditionally rely on developers and architects finding out this information from users and implementing it during build.

Data Quality Services’ Standard Error

Running DQS cleansing and/or matching as part of an SSIS package is extremely handy, but can be difficult to debug, even with the improved logging and management offered by the SSIS Catalogue.

In my packages, whenever there was an issue I found that my SSIS logs always showed an error similar to this one:

Microsoft.Ssdqs.Infra.Exceptions.EntryPointException: The attempt to update or delete a DAO object of type ‘AKnowledgebase’ with id 1000454 has failed because the object is not up to date or is being deleted from the database.

This generally proves to be somewhat of a red herring, and actually seems to be a separate issue with the way that DQS handles an unexpected error.  Rather than falling over gracefully, it seems to cause this additional exception.  More often than not, this would be preceded by a NullReferenceException, which will be very familiar to the .NET developers out there.


Data Quality Services Log Files

To get more detail on the actual root cause of the error, you can use Data Quality Services’ built-in log files.  These can provide a more accurate error message and stack trace, allowing you to properly deal with the issue.

Data Quality Client Configuration Panel

Browse to the Log Settings tab in Configuration to change the logging level.

To enable detailed logging, open your Data Quality Client, and go to the Configuration page.  From there, click the Log Settings tab, and change the Cleansing Project drop-down to Debug.

This will enable a more detailed logging level which should capture the full context of any exception.  Then just run your SSIS package, and following failure, browse to your Data Quality Services log folder and check out the DQServerLog.DQS_MAIN file for details.

The default location is:

C:\Program Files\Microsoft SQL Server\MSSQL11.INSTANCENAME\MSSQL\Log\DQServerLog.DQS_MAIN


If you’ve run into trouble with Data Quality Services components in SSIS, or have some additional tips for stability or debugging, please leave a comment below.


Windows Phone, meet Cortana (just watch out for Halo spoilers)
Tue, 29 Apr 2014

Warning: This post contains spoilers regarding the ending of Halo 4.  You’ve been warned.


I’ve been spending some time with a special lady lately.  She’s always with me, and has been helping me organise my life.  She’s not all work though: she knows a joke or two, she’s sassy, and she loves to talk about Halo.  I am, of course, talking about one of Windows Phone 8.1’s killer features: Cortana.

With Windows Phone 8.1 released for Developer Preview, those in the US have been enjoying the addition of Cortana to Microsoft’s OS.  However, she’s currently US-only.  Fortunately, some intrepid users over at WPCentral.com found that you can get Cortana in any other country just by editing your phone’s settings.  So I changed mine and voilà, there she was.


Siriously?

Just to deal with, and debunk, the obvious comparison straight off the bat: Cortana is not Siri.  I’ve been using Cortana for a couple of days now, and she’s already proving to be smarter than Apple’s effort and more in line with what Google have achieved with Google Now.  I’m already setting reminders to do things when I leave work (based on location rather than time) and to ask a particular person about something the next time I text or call them, having her work out the traffic for my commute, and so on.  Microsoft has suggested where Cortana is going: learning your habits and predicting what you want before you ask for it.  That’s a long way off still, but the initial impressions are very good.  Cortana is a seriously cool feature, and adds actual value to the OS experience, rather than just being a gimmick.  I’ve already stopped using OneNote and now make all my calendar entries through Cortana.

She’s also got a decent sense of humour, and if you ask her to tell you a joke, she’s got a decent sized repertoire.


Is Cortana REALLY dead?

There are also some brilliant Halo-related easter eggs to be discovered with Cortana, and more are being unearthed every day.  When I asked her “Are you REALLY dead?” (following the ending of Halo 4), she replied with the excellent “Yeah, but…it was worth it.  The Didact was a jerk.”  Well done, Microsoft.  Well done.  I’ve attached a few more examples in the gallery below.


The version of Cortana available now on WP8.1 Developer Preview is surely only a taster of what’s to come, but she’s already proving to be a hugely useful addition.  I’m finding that some requests don’t really work due to the US focus of the current build, but I’m sure that once Cortana rolls out to the UK later in the year, she’ll be even more accurate and awesome.  Perhaps she can keep all us WP8.1 phone owners company until Halo 5 comes out.

Are you using Cortana?  Let me know what you think, and anything you’ve asked her in the comments below.
