Activating SSRS Report Content Types for SharePoint

For information on how to set up a report library and the relevant content types in SharePoint, see this previous post – Create a SharePoint SSRS Report Library

I recently had trouble publishing an SSRS report to SharePoint.  I was unable to find the Report Server content types in the library.  I needed to activate the Report Server Integration Feature before I could add the SSRS content types to the library.  Here is how to do that.

Go to Site Settings

image

Under Site Collection Administration, choose Site collection features

image

Beside Report Server Integration Feature click the Activate button.

image

That’s it, you’re done. Now you will find the Report Server content types listed in the Site Content Types.

Moving SharePoint Documents to the File System

You’ll want to read my previous post Moving SharePoint List Attachments to the File System, to get all the details and requirements for setting up and running these SSIS script tasks.

This is the SSIS package code which iterates through the document library to gather some relevant information about the documents, and then moves specified documents from the document library to the file system.

I will just explain the two script task steps, as the rest will be specific to your task.

image

Populate SP_ExpenseAttachments Script Task

This code iterates through the document library to get some relevant information about the documents.

using System;
using System.Data;
using Microsoft.SqlServer.Dts.Runtime;
using System.Windows.Forms;
using System.IO;
using Microsoft.SharePoint;
using System.Data.SqlClient;
using System.Net;

namespace ST_573f63e769424529b4c14ec196d01e4f.csproj
{
    [System.AddIn.AddIn("ScriptMain", Version = "1.0", Publisher = "", Description = "")]
    public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
    {

        #region VSTA generated code
        enum ScriptResults
        {
            Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success,
            Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
        };
        #endregion

        /*
        The execution engine calls this method when the task executes.
        To access the object model, use the Dts property. Connections, variables, events,
        and logging features are available as members of the Dts property as shown in the following examples.

        To reference a variable, call Dts.Variables["MyCaseSensitiveVariableName"].Value;
        To post a log entry, call Dts.Log("This is my log text", 999, null);
        To fire an event, call Dts.Events.FireInformation(99, "test", "hit the help message", "", 0, true);

        To use the connections collection use something like the following:
        ConnectionManager cm = Dts.Connections.Add("OLEDB");
        cm.ConnectionString = "Data Source=localhost;Initial Catalog=AdventureWorks;Provider=SQLNCLI10;Integrated Security=SSPI;Auto Translate=False;";

        Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.

        To open Help, press F1.
    */

        public void Main()
        {
            // Read the library document info and write it to a SQL table

            string SharePointSite = (string)Dts.Variables["SPSite"].Value;
            SPSite mySite = new SPSite(SharePointSite);
            SPWeb myWeb = mySite.OpenWeb();
            SPList myList = myWeb.Lists["ExpenseAttachments"];
            SPDocumentLibrary myLibrary = (SPDocumentLibrary)myList;
            SPListItemCollection collListItems = myLibrary.Items;

            // Set up the SQL connection once, outside the loop
            string sSqlConn = Dts.Variables["SqlConn"].Value.ToString();
            SqlConnection sqlConnection1 = new SqlConnection(sSqlConn);
            SqlCommand cmd = new SqlCommand();
            cmd.CommandType = CommandType.Text;
            cmd.Connection = sqlConnection1;
            sqlConnection1.Open();

            foreach (SPListItem myListItem in collListItems)
            {
                string ItemId = myListItem.ID.ToString();
                string attachmentAbsoluteURL = SharePointSite + "/" + myListItem.File.Url;
                string attachmentname = myListItem.File.Name;

                // Insert one row per document into the staging table
                cmd.CommandText = "INSERT INTO SP_ExpenseAttachments (WorkflowName, DocumentLibrarySharePointID, AttachmentName, AttachmentURL) VALUES ('Expense','" + ItemId + "','" + attachmentname + "','" + attachmentAbsoluteURL + "')";
                cmd.ExecuteNonQuery();
            }

            sqlConnection1.Close();
            myWeb.Dispose();
            mySite.Dispose();

            Dts.TaskResult = (int)ScriptResults.Success;
        }
    }
}
Read Attachment Information and Move Expense Attachments Script Task

This code accepts a document id from a variable, writes some relevant information about the document to a SQL table, and then copies and renames the document to the file system.

using System;
using System.Data;
using Microsoft.SqlServer.Dts.Runtime;
using System.Windows.Forms;
using System.IO;
using Microsoft.SharePoint;
using System.Data.SqlClient;
using System.Net;

namespace ST_573f63e769424529b4c14ec196d01e4f.csproj
{
    [System.AddIn.AddIn("ScriptMain", Version = "1.0", Publisher = "", Description = "")]
    public partial class ScriptMain : Microsoft.SqlServer.Dts.Tasks.ScriptTask.VSTARTScriptObjectModelBase
    {

        #region VSTA generated code
        enum ScriptResults
        {
            Success = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Success,
            Failure = Microsoft.SqlServer.Dts.Runtime.DTSExecResult.Failure
        };
        #endregion

        /*
        The execution engine calls this method when the task executes.
        To access the object model, use the Dts property. Connections, variables, events,
        and logging features are available as members of the Dts property as shown in the following examples.

        To reference a variable, call Dts.Variables["MyCaseSensitiveVariableName"].Value;
        To post a log entry, call Dts.Log("This is my log text", 999, null);
        To fire an event, call Dts.Events.FireInformation(99, "test", "hit the help message", "", 0, true);

        To use the connections collection use something like the following:
        ConnectionManager cm = Dts.Connections.Add("OLEDB");
        cm.ConnectionString = "Data Source=localhost;Initial Catalog=AdventureWorks;Provider=SQLNCLI10;Integrated Security=SSPI;Auto Translate=False;";

        Before returning from this method, set the value of Dts.TaskResult to indicate success or failure.

        To open Help, press F1.
    */

        public void Main()
        {
            // Read the document info, write it to a SQL table, and copy the file to the file system

            string SharePointSite = (string)Dts.Variables["SPSite"].Value;
            SPSite mySite = new SPSite(SharePointSite);
            SPWeb myWeb = mySite.OpenWeb();
            SPList myList = myWeb.Lists["ExpenseAttachments"];

            int ItemID = (int)Dts.Variables["ItemID"].Value;
            string sItemID = ItemID.ToString();

            SPListItem myListItem = myList.GetItemById(ItemID);
            string attachmentAbsoluteURL = SharePointSite + "/" + myListItem.File.Url;
            string attachmentname = myListItem.File.Name;

            // Set up the SQL connection
            string sSqlConn = Dts.Variables["SqlConn"].Value.ToString();
            SqlConnection sqlConnection1 = new SqlConnection(sSqlConn);
            SqlCommand cmd = new SqlCommand();
            cmd.CommandType = CommandType.Text;
            cmd.Connection = sqlConnection1;
            sqlConnection1.Open();

            // Record the document details; NewFileName is 'E' plus the item id left-padded to 11 digits
            cmd.CommandText = "INSERT INTO SP_Attachments (WorkflowName, DocumentLibrarySharePointID, AttachmentName, AttachmentURL, Moved, NewFileName) VALUES ('Expense','" + ItemID + "','" + attachmentname + "','" + attachmentAbsoluteURL + "','" + 0 + "','E' + RIGHT('00000000000' + CAST(" + ItemID + " as VARCHAR),11)" + ")";
            cmd.ExecuteNonQuery();
            sqlConnection1.Close();

            string MRI = (string)Dts.Variables["MRI_File_Location"].Value;
            DirectoryInfo dir = new DirectoryInfo(MRI);

            if (dir.Exists)
            {
                // Build the local file name, e.g. \E00000000123.pdf
                string FileExt = attachmentname.Substring(attachmentname.Length - 4);
                string ItemNum = "00000000000" + sItemID;
                string ItemName = ItemNum.Substring(sItemID.Length, 11);
                string FileName = "\\E" + ItemName + FileExt;
                FileInfo file = new FileInfo(dir.FullName + FileName);

                if (!file.Exists)
                {
                    if (attachmentAbsoluteURL.Length != 0)
                    {
                        // Download the file from SharePoint to the local folder
                        WebClient client = new WebClient();
                        client.Credentials = System.Net.CredentialCache.DefaultCredentials;
                        client.DownloadFile(attachmentAbsoluteURL, file.FullName);
                    }

                    // Mark the record as moved
                    sqlConnection1.Open();
                    DateTime Now = DateTime.Now;
                    cmd.CommandText = "UPDATE SP_Attachments SET Moved = 1, Moved_Date = '" + Now + "' WHERE WorkflowName = 'Expense' and DocumentLibrarySharePointID = '" + ItemID + "'";
                    cmd.ExecuteNonQuery();
                    sqlConnection1.Close();
                }
            }

            myWeb.Dispose();
            mySite.Dispose();

            Dts.TaskResult = (int)ScriptResults.Success;
        }
    }
}

Create a SharePoint SSRS Report Library

For whatever reason this type of library is not available out-of-the-box.  I have to set it up manually every time.  Here are the steps for this particular client.  They will be similar for other SharePoint/SSRS setups.  You can follow these instructions to set up an SSRS Data Source library and an SSRS Shared DataSet library.

Document type                Content Type
Reporting Services Report    Report Builder Report
Data Source                  Report Data Source
Shared DataSet               Report Builder Report

1. Create a Document Library in SharePoint.

Go to Libraries.  Click Create.  Call your new library “Reports”.

image

2. Allow Management of Content Types.

Click on your library. Go to Library Settings. Click on Advanced Settings.  Change the radio button for “Allow management of content types” to Yes.  Click OK.

3. Add Report Content type.

In the Library Settings, under Content Types, click on “Add from existing site content types”.  In this case the client is using Report Builder content types for reporting, which will work fine for Reporting Services reports.

 

image

From the “Select site content types from” drop down, select “Report Server Content Types”.  Add any content types you would like to maintain in your Reports library.  I prefer to keep data sources and data sets in separate libraries, but some people like to keep them in one library.  Take note that whichever content type floats to the top of your Content Type list will be the Default content type for your library.  This will matter when creating and adding new documents.  Add the default Content Type first, and then any others.  Click OK.

If you don’t see the Report Server Content Types listed in the drop down, you may need to activate them on your Site Collection.  Read this post to find out how to do that – http://thedataqueenblog.azurewebsites.net/2013/05/activating-ssrs-report-content-types-for-sharepoint/

image

 

4. Delete the Document Content Type.

Under Content Types select the “Document” content type.

image

Select “Delete this content type”.

image

This will make the Report Builder Report content type the default content type.

You are ready to deploy SSRS reports to your SharePoint library.

If you find that you are unable to deploy your reports using the deploy feature in Visual Studio – for example, Visual Studio keeps asking you for credentials when you try to deploy – you may want to manually upload the reports and data sources to SharePoint.  Read this post to find out how: http://thedataqueenblog.azurewebsites.net/2012/07/manually-deploy-ssrs-reports-to-sharepoint/

Data-Driven Subscription Fails but Report Runs Manually

In a continuation of my last blog post on Finding Report Subscription Errors, there was a tricky little reason why the subscription was failing for some of the parameter values, even though the reports could all be run manually.  Just like the Current User Filter in SharePoint, the parameter value matching is CASE SENSITIVE.

I got this error in the trace log:

library!WindowsService_113!308!05/29/2012-10:31:17:: e ERROR: Throwing Microsoft.ReportingServices.Diagnostics.Utilities.InvalidReportParameterException: , Microsoft.ReportingServices.Diagnostics.Utilities.InvalidReportParameterException: Default value or value provided for the report parameter 'Manager' is not a valid value.;

But all of the values being fed to my subscription were valid values which worked when manually running the report.  Why did some of the reports in the subscription render successfully and others not?  It turns out the underlying query in SSRS for the Manager parameter dropdown returned a mix of values, some all lowercase and some with the first two letters capitalized.  The query I was using in my data-driven subscription was all lowercase.  So only the values in the report dropdown which were lowercase were running successfully.

I solved the problem by changing the underlying query feeding the SSRS Report Parameter to lowercase, and ensuring that my subscription query was also lowercase.  This resolved all the errors.
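To give a rough idea, here is a minimal sketch of how the two queries end up once normalized.  The table and column names (dimEmployee, ManagerLogin) are made up for illustration – substitute your own:

-- Available values query for the Manager report parameter
SELECT DISTINCT LOWER(ManagerLogin) AS Manager
FROM dimEmployee

-- Data-driven subscription query, normalized the same way so every value matches the parameter list exactly
SELECT DISTINCT LOWER(ManagerLogin) AS Manager
FROM dimEmployee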

Finding Report Subscription Errors

I had an issue trying to find what was causing a data-driven report subscription error.  The SSRS report is deployed to SharePoint, and Reporting Services is in SharePoint Integrated mode.  The subscription was showing last results as “Done: 15 processed of 15 total; 7 errors.”  It took a while to find the pieces I needed to figure out what was causing the error.

I found some information about looking at Report History in SharePoint and creating a New Snapshot, which would give me the last known error.  However, since some of the reports had run successfully this did not work. 

Next I tried looking at the Report Server database in the ExecutionLog tables.  I isolated the query results to just the one report subscription by writing a query like this:

Use ReportServer
select * from ExecutionLog3
where RequestType = 'Subscription' AND TimeStart > '2012-05-29 11:30:00.000'
ORDER BY TimeStart DESC

This only returned the successes, not the failures.

Finally, I tried looking in the Report Server Trace Log file.  There was very little in the log file and nothing to do with my subscription.  I knew that the Trace Log file should hold the information I needed.  After much poking around I realized that I had made a fundamental error in my assumption about the architecture. The Trace Log file resides on the SharePoint server, not the Report Services database server. 

Find the Trace Log

The Trace log files can be found on the SharePoint server, usually here:  C:\Microsoft SQL Server\MSRS10_50.MSSQLSERVER\Reporting Services\LogFiles

Find the ScheduleID for your Subscription

Find the Schedule ID of the most recent subscription by querying the ReportServer database:

select
'SubnDesc' = s.Description,
'SubnOwner' = us.UserName,
'LastStatus' = s.LastStatus,
'LastRun' = s.LastRunTime,
'ReportPath' = c.Path,
'ReportModifiedBy' = uc.UserName,
'ScheduleId' = rs.ScheduleId,
'SubscriptionId' = s.SubscriptionID
from ReportServer.dbo.Subscriptions s
join ReportServer.dbo.Catalog c on c.ItemID = s.Report_OID
join ReportServer.dbo.ReportSchedule rs on rs.SubscriptionID = s.SubscriptionID
join ReportServer.dbo.Users uc on uc.UserID = c.ModifiedByID
join ReportServer.dbo.Users us on us.UserID = s.OwnerId

Search for the ScheduleID in your Trace Log and find the Error message

Open the appropriate log file, based on the time stamp being the most recent after the subscription ran, and search for the ScheduleID.  Once you find the first entry for your ScheduleID, look for anything that starts with “e ERROR”.

The error message can look like this:

library!WindowsService_113!308!05/29/2012-10:31:17:: e ERROR: Throwing Microsoft.ReportingServices.Diagnostics.Utilities.InvalidReportParameterException: , Microsoft.ReportingServices.Diagnostics.Utilities.InvalidReportParameterException: Default value or value provided for the report parameter 'Manager' is not a valid value.;

Credits

In preparing this post, I found the following article to be useful:

http://blogs.msdn.com/b/deanka/archive/2010/02/16/troubleshooting-subscriptions-part-ii-using-the-report-services-trace-log-file.aspx

Populating Date fields in SharePoint using SSIS

This is an addendum to my previous blog post How to use a SharePoint list as a data source in your SSIS package.

If you are populating a SharePoint list with a field of date type, you need to format the date as text, in the following format, in order for SharePoint to accept the input from SSIS.

'yyyy-mm-dd'      or, for one-digit months or days,      'yyyy-m-d'

So if you’re using a SQL Query you can write something really ugly, like this:

SELECT
CAST(DATEPART(Year, GETDATE()) AS varchar(4)) + '-' + CAST(DATEPART(Month, GETDATE()) AS varchar(2)) + '-' + CAST(DATEPART(Day, GETDATE()) AS varchar(2)) AS [Adjustment Date]
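If your list is happy with the two-digit month and day form (the 'yyyy-mm-dd' option above), a slightly tidier alternative is CONVERT with style 23, which returns the ISO yyyy-mm-dd format directly:

-- Produces 'yyyy-mm-dd' with leading zeros, e.g. 2013-05-07
SELECT CONVERT(varchar(10), GETDATE(), 23) AS [Adjustment Date]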

Trigger an SSIS Package from SharePoint using Nintex Workflow

There are many times when my clients want to be able to refresh data on demand, without having to wait for a nightly data warehouse refresh.  Quite often this occurs when they are updating data in SharePoint lists which get fed into the data warehouse and from there into cubes and reports.  There is a very simple way to allow a user to trigger the data refresh of just their specific piece of data, using Nintex Workflow.  If you don’t know about Nintex, it is a third party product which adds a drag-and-drop workflow designer and advanced workflow features to SharePoint.  It is inexpensive, and a great way to empower users to manage their business processes.

In this example a user has updated a series of SharePoint lists with current vendor scorecard data and would like to see it reflected in the cube and reports.

SSIS PACKAGES BEST PRACTICES

You will need to create an SSIS package which triggers the data refresh desired by the user.  When building SSIS packages which populate a data warehouse I like to keep things modular, with related actions in individual packages, and then use a master package to trigger the packages in the correct order.  In this way I can trigger smaller jobs simply by creating additional master packages to run the relevant packages which already exist.  This is best practice since I only have to maintain the logic in one package, which can then be used by multiple master packages.  I normally create one package for each of my dimension (attribute) tables and one for each of my fact (transactional) tables wherever possible, unless they are interdependent.  I try to keep each package as modular as possible so I can trigger them in different sequences where necessary.

Using a master package to trigger a sequence of packages has many advantages over doing multiple steps in SQL Server Agent.  I only trigger one master package using a SQL Server Agent Job.  It allows me to keep the order of running a sequence of packages in SSIS where I can add annotations to remind myself why certain packages need to be run before others. This helps me remember the interdependencies when I am making changes.  It also allows me to control the checkpoints, which gives me the ability to rollback multiple steps where necessary.  And it helps me when reviewing the SSIS logs.  I can see how long the master package takes to run from beginning to end, and compare that over time.

CREATE A NEW SSIS MASTER PACKAGE

For this example I have a master package which populates the entire data warehouse and refreshes the data cube at the end.  This is run nightly.  I will add an additional master package, to be triggered by the user, which only refreshes the vendor scorecard data from the SharePoint lists into the data warehouse and refreshes the data cube.  We’ll call it MASTER Vendor Scorecard Refresh.

image

CREATE A SQL SERVER AGENT JOB

Now create a SQL Server Agent Job with one step which triggers the SSIS package you just created.  Do not schedule this job, since it will be triggered manually by the user. 

image
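If you prefer to script the job rather than use the SSMS dialogs, a minimal sketch looks something like this – the package file path is just an assumption, so point it at wherever your MASTER Vendor Scorecard Refresh package is deployed:

USE msdb
GO
-- Create the job with no schedule, since the user will trigger it on demand
EXEC dbo.sp_add_job @job_name = N'Vendor Scorecard Refresh'

-- One step that runs the master SSIS package (adjust the /FILE path to your deployment)
EXEC dbo.sp_add_jobstep
@job_name = N'Vendor Scorecard Refresh',
@step_name = N'Run MASTER Vendor Scorecard Refresh',
@subsystem = N'SSIS',
@command = N'/FILE "C:\SSIS\MASTER Vendor Scorecard Refresh.dtsx"'

-- Register the job on the local server
EXEC dbo.sp_add_jobserver @job_name = N'Vendor Scorecard Refresh'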

You will want to add a notification to email you when the job completes while you are doing your testing. You could change the notification to when the job fails once you know it’s working correctly.

CREATE A SHAREPOINT LIST AND WORKFLOW

Create a SharePoint list, called Vendor Scorecard Data Refresh.  Add one item to the list.  You could name the item Trigger Vendor Scorecard Data Refresh Workflow.  In your List tool bar select Workflow Settings, Create a Workflow in Nintex Workflow.

image

image

Leave the template as Blank and click the Create button.

image

In the bottom left of your screen choose the workflow action type of Integration.

image

From the resulting actions which appear in the Workflow Actions pane, drag the Execute SQL action onto your workflow.

image

Double click in the center of the Execute SQL icon to configure the action.  Select OLEDB and key in your connection string:

Data Source=yourserver;Initial Catalog=yourdatabase;Provider=SQLNCLI10.1;Integrated Security=SSPI;

Check the box for Use Windows authentication.  Key in the username and password that has permission to trigger the SQL Server Agent job.  Enter the query which will trigger the job:

EXEC msdb.dbo.sp_start_job N'Vendor Scorecard Refresh'
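The account you key in needs rights to start the Agent job.  If it isn’t a sysadmin, one minimal way to grant that – assuming the login already exists on the server, and using a made-up account name – is to add it to the SQLAgentOperatorRole in msdb:

USE msdb
GO
-- The Windows account keyed into the Nintex Execute SQL action (name is an assumption)
CREATE USER [mydomain\workflowaccount] FOR LOGIN [mydomain\workflowaccount]
-- Members of SQLAgentOperatorRole can start any SQL Server Agent job
EXEC sp_addrolemember N'SQLAgentOperatorRole', N'mydomain\workflowaccount'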

 

image

Click on Save to save the action.  Publish the workflow by clicking on the Publish button.

image

TRIGGER THE WORKFLOW MANUALLY

Navigate to your Vendor Scorecard Data Refresh SharePoint list.  Click on the dropdown beside your one item and select Workflows.

image

Select the workflow you want to start.  You can store more than one workflow on a SharePoint list, but this can get confusing to users so I recommend you keep it to one workflow per SharePoint list unless there is a very good reason to do otherwise.

image

Click the start button to trigger the workflow.

image

If you set up notifications on the SQL Server Agent job you will receive an email when the job completes.  The email will tell you if the job was successful. 

image

If you did not set up notifications you can check the history of the job. The history will not appear on the job until the job completes, so you will need to wait a few minutes for the steps in your SSIS package to complete before you can check if it ran successfully. In SSMS right click on the SQL Server Agent Job and select View History.  You will see a green checkmark beside the date and time if the job ran successfully.  If it fails you will see a red X.

image
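If you would rather check from a query window than the SSMS GUI, this returns the same history for the job:

EXEC msdb.dbo.sp_help_jobhistory @job_name = N'Vendor Scorecard Refresh'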

That’s it, you’re done.  Users can now trigger their own data updates as required.

How to Use a Current User Filter to filter an SSRS Report Web part in SharePoint (using a list of Available Values)

It is not an uncommon requirement to be able to publish an SSRS report to SharePoint and then use it in a Web part filtered by the user viewing it.  You include the domain\userid in your main report query and filter the query on this value using a report parameter where the user value can be typed in.
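As a rough illustration, the main dataset query might filter on the parameter something like this.  It is only a sketch – myTable and myuserid are the same placeholder names used in the available values query further down, and the ' ALL' handling is covered in the post linked below:

-- Main report dataset, filtered by the @UserAccount parameter (domain\userid, lowercase)
SELECT t.*
FROM myTable t
WHERE 'mydomain\' + LOWER(t.myuserid) = @UserAccount
OR @UserAccount = ' ALL'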

However, in many cases the report does double duty as a report stored in a Report library, where users could choose multiple or ALL users.  In that case you want to have a list of Available Values in your parameter so that someone viewing the report in the library can pick from a list of users rather than have to know everyone’s user id.  The tricky part is formatting the domain\userid in your parameter so SharePoint can use it, and adding yourself to the list of available values so you can test that the Web part works as expected.

These are easy to do, but it took me a little bit of time to figure it out, so I’m blogging it here.

CREATE THE LIST OF AVAILABLE VALUES FOR YOUR PARAMETER

The SharePoint Current User Filter expects the domain\userid in the parameter list of available values to be all lowercase.  So when creating the SQL query for the available values you need to convert them to lowercase.  You also want to add yourself to the list for testing purposes, otherwise you will get an error when looking at the Web part in SharePoint.  You can do this using a UNION clause.  Your query for the list of available values should look like this:

SELECT DISTINCT
'mydomain\' + LOWER(myuserid) AS UserAccount,
myUserName as UserName
FROM myTable

UNION
SELECT
' ALL' AS UserAccount,
' ALL' AS UserName

UNION
SELECT
'mydomain\mydevuserid' AS UserAccount,
'mydevusername' AS UserName

I won’t go into the details here of how to use ALL in your report, but you can read more here http://thedataqueenblog.azurewebsites.net/2011/06/how-to-default-to-all-in-an-ssrs-multi-select-parameter/

Configure your user parameter as follows:

image

image

image

Publish your report to the relevant SharePoint library, then navigate to the library and test that the report and the parameter are working as expected.

ADD THE REPORT TO A WEB PART

Add the report to a web part as you normally would.  You would add a web page, and configure a web part to be a SQL Server Reporting Services Report Viewer.  Edit the web part and navigate to the report you created.  Open the Parameters section and click on the Load Parameters button.  You can leave the parameter default as “Use Report Default Value”.  Click Apply and OK.

ADD THE CURRENT USER FILTER TO A WEB PART

Add a web part and choose the Current User Filter type from the Filters section.  It will say that it is not connected.

CONNECT THE FILTER TO THE SSRS REPORT VIEWER

Go back to your Report Viewer web part and from the drop down choose Connections –> Get Report Parameters From –> Current User Filter

image

A dialog box will pop up where you can choose your User parameter and click on Finish.

image

Check in your changes and view the results.  You will be able to see your web page with the report filtered on your user name.  As a developer, if you do not have any values in this report you should see the report with no values returned, rather than getting an error.  This is because you added your userid to the list of available values at the beginning of this exercise.  You might want to remove yourself from the list once you have tested that the web part is working correctly.

Dynamically filter a Drop Down on your SharePoint list by another User Selected Value

This is one in a series of posts on using InfoPath. If you are new to SharePoint lists and InfoPath you might want to start with the Best Practices.

Best Practices for Cascading SharePoint Lists using InfoPath Forms

Use a SQL Server data source for an InfoPath pick list

Continuing with the same example I used in the Best Practices post, I’m going to make a change to an existing InfoPath form.  Navigate to the SharePoint list called Supply Chain Scorecard Metrics and add a new item. In the Section drop down, notice that I’ve included the Goal as part of the Section description (Goals are Improve, Maintain and Track). 

image

What I would like to do is have the user choose the Goal, and then filter the Section list to show only those Sections which have that Goal.  Here’s how we can do that.

ADD THE GOAL TO YOUR LIST

Navigate to your Supply Chain Scorecard Metrics list and in the List menu click on the List Settings.  Add a column called Goal_ID with type of Number.  Require that the column contains information.  When the user selects a Goal we will store the Goal ID in the Metrics list, and then use that to filter the drop down list of available Sections the user can choose from.

image

ADD THE GOAL TO YOUR FORM

Navigate to your Supply Chain Scorecard Metrics list and in the List menu click on the Customize Form button to open up the InfoPath form.

image

Click OK and enter a password if necessary.  Click Yes when it asks if you want to update the fields from the SharePoint list.

Add a row to your InfoPath form above Section by highlighting the Section row, right clicking, and selecting Insert –> Rows Above.

image

From the Fields window, drag the Goal_ID field into the right hand cell in the row you just created.

image

Delete the label that says Goal_ID:, and type in Goal in the cell to the left of the Goal_ID textbox.

image

Right click on the Goal_ID text box and select Change Control –> Drop-Down List Box

image

Right click again and select Drop-Down List Box Properties.

image

Select Get choices from an external data source and then from the data source drop-down choose Supply Chain Scorecard Section.

image

Set Value = Goal_ID and Display name = Goal.  Check the box that says Show only entries with unique display names.  Click OK.

image

FILTER THE SECTION CHOICES BY THE GOAL SELECTED

Right click on the Section drop down list box and select Drop-Down List Box Properties.  Beside the Entries box, click on the tree icon.

image

Click on the Filter Data button.

image

Click on the Add button.

image

Now we will specify what to filter the Section list on.  In the first drop down select Goal ID.  Leave the second drop down as “is equal to”.  In the third drop down select “Select a field or group”.

image

Change the data connection in the drop down to Main. Navigate to dataFields and then to Goal_ID.

image

Click all of the OK buttons. Publish your InfoPath form.

Add a new item.  Select Improve as your Goal.  Notice that the Section drop down has only those Sections that have a Goal of Improve.

image

And that is how you filter a drop down from another value in your list.

Use a SQL Server data source for an InfoPath pick list

This is one in a series of posts on using InfoPath. If you are new to SharePoint lists and InfoPath you might want to start with the Best Practices.  The example I will be using here is a continuation of the one started in the Best Practices post.

Best Practices for Cascading SharePoint Lists using InfoPath Forms

Dynamically filter a drop down on your SharePoint list by another user selected value

Before you even start, be aware that using this method stores the password for the SQL connection in plain text directly in the data connection.  Be sure you put security measures in place to prevent others from navigating to the connection file in the SharePoint library, and create a database login with a low level of read-only access to information which is not sensitive.  We are using the data connections for simple pick lists, so this shouldn’t be an issue.

Create a SharePoint library to store the data connections for InfoPath.  The Type of library is a Data Connection library. Call it DataConnectionsInfoPath.  The data connection information for the InfoPath connections will be stored in text format so be sure that security to this folder is limited to only administrators.

Create a SQL login to the database to which you will be connecting to get the data used in the pick lists for SharePoint.  The login information will be stored in the data connection as text, so be sure to create a separate login with read only capabilities to only those tables required for the InfoPath pick lists used in SharePoint.  Choose a password that is not commonly used by other admin connections. (C’mon, you know you reuse passwords).
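As a sketch, that login might be created like this – the login name, password and database are made up, and the GRANT comes in the next step once the view exists:

-- Low-privilege, read-only login used only by the InfoPath pick list connections (names are assumptions)
CREATE LOGIN InfoPathPickList WITH PASSWORD = 'Str0ng!UniquePassword'
GO
USE myDatabase
GO
CREATE USER InfoPathPickList FOR LOGIN InfoPathPickList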

Create a view in your database from which the pick list will be fed.   The pick list can only be sourced from one table, so by creating a view you can control the values that show up in that table. If you need to filter out, change or add any values in the future, you can use the view to do it without having to change the data connection or the InfoPath form.
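Here is a minimal sketch of such a view, matching the vwSitePicklist connection used later in this post.  The underlying dimSite table and its IsActive column are assumptions – use whatever source table holds your sites:

-- View that feeds the Site pick list; expose only the columns and rows the form needs
CREATE VIEW dbo.vwSitePicklist
AS
SELECT Site_ID, Site
FROM dbo.dimSite
WHERE IsActive = 1
GO
-- Give the pick list login read access to the view only
GRANT SELECT ON dbo.vwSitePicklist TO InfoPathPickList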

Navigate to the SharePoint list which has already been set up, in this case Supply Chain Scorecard Metrics from my previous blog post.  Open the InfoPath form by clicking on the Customize Form button in the List menu.  

To get the pick list for Site from a SQL Server view do the following.

Right click on the Site text box on the form and change the control to a drop-down list box.

image

Right click again and edit the properties.

image

Select Get choices from an external data source.  Click the Add button beside Data source.

image

Create a new connection to Receive data.

image

Receive data from a Database (Microsoft SQL Server only)

image

Click on the Select Database button.

image

Click on the New Source button

image

Select Microsoft SQL Server

image

Enter your server name and use the login you created at the beginning of this exercise.

image

Select the database where your pick list data is stored, and select the view you created at the beginning of this exercise.  The InfoPath data connection that you will be sharing in the SharePoint library will have the query embedded in it, so you can’t just have a connection to the database.  You need to choose a specific table.

image

Save your connection file.

image

Re-enter your database password.  Click Yes to save your password into the InfoPath connection file in plain text.

image

Click Next.

image

Click Next, Next, Finish.

Leave the Value as @Site_ID and change the Display name to @Site by clicking on the folder tree to the right and selecting Site.

image

Click OK.

 

PUBLISH YOUR INFOPATH DATA CONNECTION TO SHAREPOINT

From the Data menu, select Data Connections.

image

Select vwSitePicklist and click on the Convert to Connection File button.

image

Key in the location of your InfoPath Connection library you created earlier, and add a name for the data connection you are creating.  Give it a suffix of .udcx.  Click OK.  It will take a minute or two to create the connection.

image

 

Click Close.

Navigate to your SharePoint library that contains the InfoPath data connections.  You will see your new data connection here.  Notice that the Approval Status is “Pending”.  You need to change it to Approved before you can use the data connection.

Click on the dropdown beside your new data connection.  Select Approve/Reject.

image

Select Approved.  Click OK.

image

 

Now go back to InfoPath and publish your InfoPath form.  Navigate to your SharePoint list and Add a new item.  You can choose a Site from the Site pick list which is pulling directly from SQL Server.

Pretty cool, right?  No need to push data into SharePoint and keep it synchronized, just pull directly from SQL Server.  I love it!