Protiviti / SharePoint Blog

6/21/2019 1:59 PM

As anyone in IT can tell you, what is true today in terms of capabilities will most likely not be true in a year. In fact, heavily adopted platforms such as the Cloud are unlikely to have the same limited feature set in three months' time that they have today.

The Cloud is changing, and the major players are competing hard in that market space. That means one thing for sure: change will be constant. Here are just a few updates to the Azure platform as of today. Keep in mind these are big changes made on a global scale. For more information on the rapid pace of change, read here:
https://azure.microsoft.com/en-us/updates/
Image1.PNG

There has been a proliferation of comparisons between various Cloud Access Security Broker (CASB) solutions.  A traditional CASB solution allows you to protect, track and control how your organization's data is being used, both on-premises and in the Cloud.  Influencers such as Gartner and Forrester are regularly churning out comparisons of top vendors such as Netskope, Microsoft, McAfee and Symantec.  Part of the reason for this new focus on CASB solutions is the recognition in the Enterprise of the following:
1. Users have been, and will continue, adopting Shadow IT to meet business needs. Unmanaged Shadow IT represents a risk to organizations.
2. The Internet of Things has expanded dramatically, and at the same time the list of regulations governing sensitive data has grown. Corporations are being pushed to comply with requirements such as NYDFS, GDPR, HIPAA and SOX, along with PII protections, not to mention the slew of regulations imposed by other countries such as the UK and Germany.
3. The need to collaborate with vendors and partners has increased, with a consistent focus on real-time updates and access to shared data. Platforms such as OneDrive, Box, Dropbox, Google G Suite, Salesforce and ServiceNow are critical to collaboration, but still must be regulated and controlled.
 
As a team, we have reviewed quite a few comparisons of Cloud Access Security Brokers, and some appear to be biased toward or against particular vendors depending on who the author is and whether or not specific vendors paid to participate and/or took part in the comparisons.
 
Here is a snippet of a public comparison available now.  While these comparisons are useful, there are pitfalls in relying on them alone to guide your business investments.
Image2.PNG

Today, I want to address a few key areas where the Microsoft Cloud App Security (MCAS) solution may be graded a little lower than you might expect in these types of comparisons.  An important reason MCAS may get a lower rating than one might expect is that the Microsoft Cloud platform is constantly being improved.  This is a good thing, but the rapid pace of change often leaves authors at a disadvantage when it comes to maintaining a current understanding of the most cutting-edge features available.  A good example of the pace Microsoft has set in the Cloud is the number of new features released since BPOS was first introduced.  Considering that BPOS was focused mainly on mail and file use, not security, the Cloud's new focus on governance and control is easy to observe.  Since then, we have been given new control-focused features such as Identity Protection, Identity Management, Information Protection, Conditional Access and Multi-Factor Authentication.  The pace at which Azure and O365 continue to expand their feature sets is impressive.  Access to this bevy of resources is available in your Azure dashboard or at the newly revamped MCAS administration site.
 Image3.PNG

Image4.PNG

Here is a look at key features that make the MCAS solution a compelling choice, especially when your cloud platform is O365, Box or Google G Suite.  New enhancements are in the works to fully manage other 3rd-party platforms such as AWS, ServiceNow and Salesforce, so stay tuned.
 
Encryption and Quarantine:  MCAS is given a very low score here versus other vendors.  Microsoft has recently added enhancements that connect Azure Information Protection to the MCAS solution.  In short, you can apply a number of encryption/protection options to data in the O365 Cloud as well as in external 3rd-party cloud applications.  In the past this was done using Active Directory Rights Management Services; Microsoft has now improved on this offering in the cloud and continues to make advances.  A real win here is the ease with which you can create protection labels and apply them to any content in O365.
 
 
Reporting:  In terms of reporting, there are numerous built-in graphs and charts that should meet most reporting needs.  In the past, it could be argued that there wasn't much focus on reporting, but Microsoft has turned that around with its investments in Power BI.  If custom reporting is needed, logs can be accessed and used to create virtually any custom report using SSRS, Tableau or other reporting tools.
Image5.PNG

Integration:  The key advantage of the MCAS solution is that it easily integrates with the other Azure/O365 features, including Privileged Identity Management, Identity Protection, Information Protection, Conditional Access, Multi-Factor Authentication, Data Loss Prevention, Advanced Threat Protection and Windows Defender, as well as B2B and B2C solutions.

If you are using Azure and O365, it will be hard to beat the MCAS solution.  As noted, other apps where MCAS is very powerful are Box, Google G Suite and Salesforce.  New integrations being developed for AWS, Dropbox and ServiceNow should be ready in the next 6 to 12 months and will make MCAS even more attractive.
Image6.PNG

Scalability:  The MCAS solution can be configured to cover on-premises, Azure/O365 and key 3rd-party solutions.  It is also easily configured for thousands of users using Azure dynamic groups.  The Microsoft Azure global infrastructure is nothing less than magnificent; with redundancy built into every location, it can compete with any other vendor.
Image7.PNG
User Actions:  User actions are tracked and monitored, including uploads, downloads, edits and unusual activity.  As an example, the MCAS platform is intelligent enough to baseline what normal activity looks like and then alert when anomalous activity occurs.  Machine learning is being put to good use in the Microsoft Cloud.
 Image8.PNG

In short, the MCAS solution should be getting a higher ranking.  As mentioned, part of the difficulty for the authors is that Microsoft has been following a very aggressive release schedule, not just this year but over the last three years.  They are on a roll, and I don't suspect they are going to slow down as they move further into the CASB market space.

Our best advice is to stay aware, keep reading and partner with an organization that is committed to staying on top of the many changes happening in the Cloud platform space.  This will help you make the best choices for your organization based on the latest information available, as well as any comparisons you may find online.  And most important, keep in mind that when it comes to the Cloud, an online comparison can quickly become outdated.
 
If you would like more information, reach out to ECM@Protiviti.com.

General | Andy Arellano, Protiviti
  
2/6/2019 2:29 PM

Organizations have been using Microsoft SharePoint as their collaboration platform and content management solution for over 18 years now. Over the years, SharePoint has grown in its capabilities and continues to be an effective tool used by organizations around the world.

We have found, however, that after an organization's SharePoint site has been in use for an extended period of time, the site grows in volume and becomes more complex, yet is left without maintenance and is rarely monitored. The result is a complicated SharePoint site, usually with misconfigured permissions, stale client information or hacked-in code executing within the sites. For example, there are often many unresolved errors in the Event or ULS logs.

In addition, organizations want to customize their site so it doesn't look like SharePoint, or they want specific functions that aren't available out of the box. As we have audited these environments, we have found that many customizations live within the sites, and most of them were added not by developers but by end users. They find freely available JavaScript code snippets online, for example, and put them into Content Editor or Script Editor web parts to make their tasks more manageable and the SharePoint interface better. These are customizations, or applications, that were never designed to be in the SharePoint site, but the needs of the business have allowed it to happen.

The need to perform risk and health assessments is even more critical today. As sites grow larger and larger, it becomes much more complicated to know what exists in them. And with the move to Office 365 and SharePoint Online, that need is even more critical. Our Protiviti team has performed many site assessments, and we have realized that it takes not only automated processes but also manual tasks to understand and build the picture of what is within the SharePoint environment.

Historically, we have run PowerShell scripts and some 3rd-party applications to build the whole picture. The problem with this approach is that they don't give us full insight into the code that was added to the pages, or even what that code is doing. The biggest issue with not identifying the code is the potential for security or performance problems if the code is doing something it should not. In many of the data and security attacks performed last year, a high percentage stemmed from outdated code frameworks, poorly written code or even outright malicious code.

For example, simply adding code like this, though it may look harmless, could be used for malicious purposes.



The code itself is not doing anything terrible on the surface; however, due to the nature of JavaScript, it is injecting items into the DOM of the browser. The code injects a new image directly at load time, and the IMG tag is pre-populated with arbitrary JavaScript commands. The newly added IMG tag is then retrieved and, using the common JavaScript eval() function, the hidden code is executed, whatever it might be.
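
For illustration, a snippet along these lines behaves the way just described. This is a hypothetical reconstruction, not an actual attack we found; the element id, image path and payload are made up, and here the payload happens to ride in the alt attribute:

<script type="text/javascript">
    // Looks like routine image handling: create an image and add it to the page.
    var img = document.createElement('img');
    img.id = 'promoBanner';
    img.src = '/sitename/PublishingImages/banner.png';
    // The "harmless" attribute actually carries arbitrary JavaScript.
    img.alt = "alert(document.cookie); /* any command could be hidden here */";
    document.body.appendChild(img);

    // Later, the tag is retrieved and the hidden code is executed via eval().
    var injected = document.getElementById('promoBanner');
    eval(injected.alt);
</script>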

Understanding why you need to perform this level of Risk Assessment is imperative. If you are not sure, ask yourselves these questions:

Do I know where all the customizations are within my SharePoint environment?

Do I know how many Script Editor web parts are on the site?

Do I know where custom JavaScript is on pages?

Do I know what the code is doing on my site?

If you answer "NO" to any of these questions, then you need to perform a detailed risk assessment.

To help us do this, and to help you as an organization be better protected, Protiviti is able to utilize a tool provided by our partner Rencore (https://rencore.com). Rencore brings a wealth of experience in the software development space, specifically around SharePoint development. Their core products initially focused on code analysis and quality, ensuring that all code developed for SharePoint was built per best practices, identifying problems and offering resolutions. In the past year, Rencore has expanded into application security for both SharePoint On-premises and SharePoint Online, providing tooling that can continuously scan and monitor your environment for changes and for code or applications added to the sites.

In the next couple of weeks, a new tool from Rencore will become available, allowing us to run a free risk and health assessment on a limited number of site collections, either in SharePoint On-premises or SharePoint Online. The purpose of the tool is to inspect the sites looking for customizations, which in reality are applications in their own right. The tool scans the sites and produces a report that provides not only details of items found within the sites but also a rating from Critical to Good for the overall health of your SharePoint sites. Below is a sample report returned by the application.


This tool is available to all, as Rencore feels that all companies should know what they have within their SharePoint environments. In fact, to quote Rencore, "Discover risks before they hurt your business." After executing the tool, you are also able to perform a full risk assessment with the help of the Rencore team (https://go.rencore.com/discover-sharepoint-application-risks). After that, you can even utilize the Rencore platform for ongoing monitoring of your environments.

Protiviti has deep expertise in audits and risk assessments, and we provide support and assistance in mitigating the application risks found within your SharePoint On-premises or SharePoint Online sites. With Protiviti and Rencore working together, we can help you understand what is in your SharePoint environment, the potential security risks and code vulnerabilities, and the mitigations for these problems. When the tool is available, we will create a walkthrough of how you can use it to check your SharePoint environment. Let our team know today if you are interested in a personal walkthrough!

 

Governance; Office 365; SharePoint 2016; Security | Liam Cleary
  
12/6/2018 3:45 PM

 

After an image file has been properly prepped (https://sharepoint.protiviti.com/blog/Lists/Posts/Post.aspx?ID=150) for uploading to a SharePoint picture library, the next step is to display the image on a web page.

The SharePoint ribbon’s Image editor provides control over basic properties, but it falls short of options for responsive image displays.

 

By using Edit Source mode, however, you can easily add a few styling attributes that will make your image display more fluid. A familiarity with HTML and CSS is useful, but you don’t need advanced knowledge to implement these small changes.

 

Percentage Widths

When an image is first inserted in the page content area, it is sized to fit the available width, plus a 5px margin all around it (Horizontal/Vertical Space). The HTML code for this set of properties looks like this:

<img src="/sitename/PublishingImages/caterpillar.png" alt="" style="margin: 5px; width: 940px; height: 705px;"/>

Unfortunately, this fixed-pixel setting for width and margins will not scale on smaller devices such as tablets and smart phones. When the photo exceeds the page width, users will be forced to scroll horizontally.

 

By changing the width to 100% instead of a specific pixel value, the image will scale up and down as needed. The 5px white space can be kept for the top and bottom margins, but it should be set to 0 for the sides so the image doesn’t exceed 100% (width + margin).

<img src="/sitename/PublishingImages/caterpillar.png" alt="" style="margin: 5px 0; width: 100%; height: auto;" />

If you don’t want to fill the entire width of the screen, the percentages for width and margins can be adjusted. For instance, you can style the image with a width of 80%, plus a side margin of 10% for a total of 100% (left margin + width + right margin). This centers the image on the page.

<img src="/sitename/PublishingImages/caterpillar.png" alt="" style="margin: 5px 10%; width: 80%; height: auto;" />

 

Min-Width

To ensure that your image doesn’t scale down too small, you can add a minimum width value to your style attributes. In this scenario, the image will scale for 80% of the screen width until the image is reduced to 320px, but it will not scale below that size. On a smart phone, the image would display closer to 100% of the screen.

<img src="/sitename/PublishingImages/caterpillar.png" alt="" style="margin: 5px auto; width: 80%; height: auto; min-width:320px;"/>

Max-Width

Conversely, you can set a maximum width for the image display to prevent it being enlarged so much that the quality degrades. This image will take 100% of the content area until it reaches 800px, at which point it will display centered on the screen no matter how wide the content area around it may be.

<img src="/sitename/PublishingImages/caterpillar.png" alt="" style="margin: 5px auto; width: 100%; height: auto; max-width: 800px;"/>
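
If it helps, the two constraints can be combined in a single style attribute. This is only a sketch; the display: block value is an addition not shown in the examples above, included so the auto side margins can center the image once the max-width cap takes effect:

<img src="/sitename/PublishingImages/caterpillar.png" alt="" style="display: block; margin: 5px auto; width: 100%; height: auto; min-width: 320px; max-width: 800px;"/>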

Testing your specs

To confirm that your image responds fluidly to a variety of situations, you can use these testing scenarios:

  • Increase page zoom to 125% and 150%
  • Decrease page zoom to 75%
  • Minimize your browser window to simulate tablet and mobile phone widths

If the image scales to fit the appropriate spaces, then you’re all set for a smoothly responsive web page.

 

SharePoint Design and UX; Tips & Tricks; Mobile Design | Carmen Carter
  
10/29/2018 1:14 PM


Microsoft Flow automates workflows between your favorite apps and services to get notifications, synchronize files, collect data, and much more. Flow can also be used to perform a Mail Merge using a couple of Microsoft SharePoint lists. This works great for sending the same email to all recipients or even structuring the email in different ways based on metadata captured in the Flow.

There are also scenarios, however, where Flows are created using a service account with elevated permissions that only a select number of users, typically in IT, have access to.  The content of the email may be owned by someone else, such as a marketing or communications person.  In this scenario, changing the email template would require contacting the IT person and requesting that specific changes be made, which is not ideal.

In this article, I am discussing the concept of using tokens inside an email template to allow easy formatting without any access to the Flow itself.

Creating the Email Template

We begin again with the email template.  In this example, we will create a template that has a few tokens incorporated in it.  I will create my email template in HTML format, but that is not a strict requirement.  Here's my template:

<h1>Welcome email</h1>
<p>Hello <--FIRSTNAME-->,</p>
<p>I have to say, your last name "<--LASTNAME-->" is very nice.</p>

Not the best example, but I hope you get the point.  I am storing this example in my email template list in SharePoint.



Building the Flow

The flow is composed of four steps:

1. Get the email template
2. Get list of recipients
3. Replace tokens with desired text
4. Send the emails




Get the email template
Start by retrieving the email template using the Get items action.  To limit the results returned, use the Filter Query option.
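
For example, assuming the template item's Title is "Welcome email", a Filter Query along these lines returns just that one item:

Title eq 'Welcome email'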



Make sure that you only have one email template with a given title; otherwise, you may get the wrong template.  This can be enforced by making the Title field in the SharePoint list unique.

Get list of recipients
Similar to getting the template, use the Get items action to retrieve the list of users who should receive an email.  In my case, I have a field called SendEmail to indicate whether a user should receive an email or not.


Replace tokens with desired text
Here's where the fun begins.  I used the Compose action for each token to replace the token placeholder with the actual value.


Before updating the tokens, I store the email template in a variable.  This way, I can always retrieve the original template for the next email being generated.  You may recall that I have two tokens in this example - <--FIRSTNAME--> and <--LASTNAME-->.  Each one is being replaced separately, although it's possible to do all the updates in a single action.
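
For instance, a single Compose action could do both replacements with a nested expression like this (just a sketch, reusing the same variable, loop and column names used in the individual Compose actions shown below):

replace(replace(variables('Email Template'),'<--FIRSTNAME-->',items('For_each_Recipient')?['First_x0020_Name']),'<--LASTNAME-->',items('For_each_Recipient')?['Last_x0020_Name'])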

To replace the <--FIRSTNAME--> token with the name from the user in the SharePoint list, I use the following expression in the Compose action.


replace(variables('Email Template'),'<--FIRSTNAME-->',items('For_each_Recipient')?['First_x0020_Name'])


This action effectively looks up the <--FIRSTNAME--> token in my Email Template variable declared above and replaces it with the First Name of the user from the SharePoint list item.

Next, I proceed with replacing the <--LASTNAME--> token with the Last Name of the user from the SharePoint list item.


replace(outputs('Replace_First_Name'),'<--LASTNAME-->',items('For_each_Recipient')?['Last_x0020_Name'])


You'll note that the second expression input is different.  That is because the input for the second Compose action is in fact the output of the Replace First Name Compose action.

Send Email
In the final step, I am sending the email to the users by looping through the list of all recipients from the Get Email Recipients action.  The body of the email is Output, which is the output of the Replace Last Name Compose action.




If you're structuring your email template as HTML (as I did), make sure you indicate in the Send an email action that it's HTML format.  Otherwise, you'll see a lot of raw tags in the body of your email.

Once the email is sent, I reset the Email Template variable to its original value, with the token placeholders, so that the next iteration can correctly replace them.

This approach works well, but there are two caveats to consider:

1. If the tokens are written incorrectly, they will not be replaced in the email itself.
2. If you want to add additional tokens, IT will need to add rules for parsing and replacing them.


To see more 'How to' posts on Microsoft Flow, visit Haniel's blog: https://www.agileo365.com/.

Office 365; Tips & Tricks; General | Haniel Croitoru, Microsoft MVP
  
7/6/2018 11:31 AM

The best practice for styling a SharePoint site collection is to centralize all CSS in a style sheet (or set of style sheets) that are referenced on the master page or page layouts. This approach ensures a coherent and consistent design and provides the necessary control for long-term management and updates of the site design.

There are times, however, when adding new style rules to the site design is not practical. When you need to style a unique block of content, rather than site-wide SharePoint components, adding custom styles to a single web page can be the better solution.

This HTML-based graphic is an example of unique content that only needs to be styled locally:

Example graphic 

Any user with Contributor permissions can edit these page-based styles, which can be a double-edged sword. The ease of access to styling can also result in easy page wrecking, but the damage can be remedied by deleting the styles from the page.

There are several different methods for applying a set of style rules to an individual page. Understanding the advantages and disadvantages of each will help you select the most appropriate one for your content.

Script Editor Web Part

The simplest and most direct way to style unique content is with a Script Editor web part.

First, add your text block with HTML markup to the page contents. You will need to work in Edit Source mode to ensure the markup is kept intact.

HTML Source panel 

Next, select a Script Editor web part from the Media & Content category and add it to any web zone on the page. Modify the web part title to clearly identify its purpose (e.g., "Cornerstone Design Styles"), then click the Edit Snippet link and paste your CSS code into the Embed pane. The HTML text block should appear styled as soon as you click Insert.

Embed panel 
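
For illustration, the block pasted into the Embed pane might look something like this (the class names are hypothetical and would match whatever markup your text block uses):

<style type="text/css">
    .cornerstone-graphic { max-width: 640px; margin: 0 auto; }
    .cornerstone-graphic h3 { color: #0072c6; }
    .cornerstone-graphic p { font-size: 0.9em; line-height: 1.4; }
</style>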

The only disadvantage to this method is that the HTML markup and CSS can be easily overwritten. If the page is edited often by numerous people, you may want to try another less exposed method.

Style Sheet Reference Link

Large blocks of CSS are more easily and safely stored in a traditional style sheet saved as a .css file. This file can be uploaded to the SharePoint Style Library or to the Documents library of the site that contains the page where your graphic is being displayed.

To reference the style sheet, add a Script Editor to the page and paste the style sheet link into the Embed panel:

Embed panel 
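
The link itself is a standard style sheet reference, for example (assuming the file was uploaded to a CSS folder in the Style Library as cornerstone-styles.css):

<link rel="stylesheet" type="text/css" href="/sites/(site name)/Style%20Library/CSS/cornerstone-styles.css" />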

Editing the style rules is a more cumbersome process, but if the link is accidentally deleted from the page, the styles are easily restored. This approach can also be used to apply the same styles to more than one page, but multiple references are usually a sign that these styles should be integrated into the main design rather than applied manually to individual pages.

Content Editor Content Link

If you’re concerned that complex HTML markup will be inadvertently overwritten if exposed on the page, you can save both the HTML and the CSS style rules in a single .txt file. This file can only be displayed in a web zone, so you may need to rearrange other text on your page to accommodate this new positioning.

First, upload this .txt file to any document library where it is easily referenced.

Text file 

To import this graphic back into the page, add a Content Editor web part to a web zone and enter the URL to the file in the Content Link field. Edit the Appearance attributes, such as Chrome, and then click OK or Apply to see the embedded graphic on the page.

Content Editor Content Link 

If you would like to display this combo package of text markup and styles in multiple site locations, the Content Editor web part can be Exported and then uploaded to the Web Part Gallery.

Deleting Page-based Styles

As mentioned above, any design issues or conflicts created by the custom styles can be easily resolved by deleting the web part displaying or linking to styles. This can usually be done just by editing the web page directly, but if the styles inadvertently make the page inaccessible, they can also be deleted through the Web Part Page Maintenance view.

Tips & Tricks; SharePoint Design and UX | Carmen Carter
  
6/8/2018 9:27 AM

SharePoint default styling allows for the quick creation of sites and pages that are ready to publish and show to users without worrying about design work beforehand. A custom master page, on the other hand, can create a unique look-and-feel to your site, but this redesign requires a substantial effort to implement and advanced SharePoint/designer skills.

In between those two extremes, there are ways to add small custom CSS touches to a sub site or a specific web page using out-of-the-box methods. Keep in mind, however, that these techniques are best suited for limited scopes, not for in-depth site-wide design implementations.

Cascading Style Sheets (CSS)

To understand how new CSS rules affect the SharePoint design, you should understand the "cascade" principle of CSS. Both the way in which style rules are referenced and the order in which they are applied determine which rules have the greatest weight or priority over others. When style rules conflict, the rule with the highest priority wins.

Master Page Style Sheets

The OOTB style sheet links coded in the <head> of the master page are at the top of the CSS cascade. Each successive file has more priority than the one above it, so the style rules in the very last file (SuiteNav.css) will overwrite any similar rules in the style sheets stacked above it.

SharePoint 2016 master page cascade (Seattle):

<head>
<link rel="stylesheet" type="text/css" href="/sites/(site name)/Style%20Library/en-US/Themable/Core%20Styles/pagelayouts15.css"/>
<link rel="stylesheet" type="text/css" href="/_layouts/15/1033/styles/Themable/corev15.css"/>
<link rel="stylesheet" type="text/css" href="/sites/(site name)/Style%20Library/en-US/Themable/Core%20Styles/controls15.css"/>
<link rel="stylesheet" type="text/css" href="/_layouts/15/1033/styles/Themable/searchv15.css"/>
<link rel="stylesheet" type="text/css" href="/_layouts/15/1033/styles/QCB.css"/>
<link rel="stylesheet" type="text/css" href="/_layouts/15/1033/styles/SuiteNav.css"/>
</head>

Alternate CSS URL

An additional style sheet link can be added into the <head> area, right below the original master page style sheets, with a simple site configuration change.

Upload a style sheet (.css file) to the site collection Style Library or to any SharePoint document library. Then browse to the Master page settings (Site Settings > Look and feel > Master page) and fill in the Alternate CSS URL field to point to that new file.

This method can work across the site collection if it is added at the root and all other sub sites inherit from the top level Master page settings. It can, however, be added through the Site Settings of a specific sub site instead. This would provide quick customization of a single site without needing a new/custom master page or page layout.

 

The link to this alternate style sheet will be injected into the <head> area of every web page in the configured sub site. Rules in this alternate CSS file should overwrite rules in the files above it, as long as each custom style rule mirrors the exact same selectors and syntax as the rule it replaces. (Selectors also influence the priority assigned to a style rule.)

<head>
<link rel="stylesheet" type="text/css" href="/sites/(site name)/Style%20Library/en-US/Themable/Core%20Styles/pagelayouts15.css"/>
<link rel="stylesheet" type="text/css" href="/_layouts/15/1033/styles/Themable/corev15.css"/>
<link rel="stylesheet" type="text/css" href="/sites/(site name)/Style%20Library/en-US/Themable/Core%20Styles/controls15.css"/>
<link rel="stylesheet" type="text/css" href="/_layouts/15/1033/styles/Themable/searchv15.css"/>
<link rel="stylesheet" type="text/css" href="/_layouts/15/1033/styles/QCB.css"/>
<link rel="stylesheet" type="text/css" href="/_layouts/15/1033/styles/SuiteNav.css"/>
<link rel="stylesheet" type="text/css" href="/sites/(site name)/Style%20Library/CSS/custom-styles.css"/>
</head>
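
As an illustration, a rule like the one below in custom-styles.css (a hypothetical example targeting the page body container) would win over the matching OOTB rule because its file is loaded last:

/* custom-styles.css */
#s4-bodyContainer {
    background-color: #f4f4f4;
}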

The obvious limitation to using the Alternate CSS URL field is that you must have the appropriate user permissions to access the Master page settings.

Custom Page Styling

There are times when you need custom styles applied only to a single page or a small set of pages. There are several ways to do this, and these methods can be used by anyone with content-authoring permissions that include editing web parts.

External File References

Upload your custom style sheet (.css file) to the Style Library or to any document library. Then, using a Script Editor web part, you can add a link to that file directly on your web page.

 

Unlike the Alternate CSS URL, this link will be inserted in the <body> of the page, where the Script Editor web part is located on the page layout, not in the <head> area. This link reference will have greater priority than the style sheet links referenced above it.

Source code in the <body> of the page:

<DIV class="ms-rte-embedcode ms-rte-embedwp">
<link rel="stylesheet" type="text/css" href="/sites/(site name)/Style%20Library/CSS/custom-styles.css" />
</DIV>

Embedded Styles

Style rules that are embedded on a page using an HTML <style> tag have even greater weight than rules added through an external file link. The easiest method to add a <style> code block to a web page is to paste it into a Script Editor web part.

 

If you’re concerned these styles will be accidentally deleted or overwritten, you can save the entire <style> block in a .txt file, upload it to a document library, then embed it using a Content Editor web part Content Link field. To prevent the styles from displaying on the page, set the web part to Hidden (Edit Web Part > Layout > Hidden checkbox).

Either approach will embed the styles directly into the page layout HTML, at the location of the web part:

<div class="ms-rtestate-field">
<style type="text/css">
body { background: #fff; color: black }
p em { background: yellow; color: black }
.note { margin-left: 5em; margin-right: 5em }
</style>
</div>

Best Practices

As useful as these techniques may be, they are also easily abused and can result in a loss of control over the designated site design. Long-term site maintenance can become burdensome with too much reliance on these locally applied styles.

A future article will provide use cases, guidelines and best practices for using embedded styles on SharePoint web pages.

Sitecore; SharePoint Design and UX | Carmen Carter
  
6/1/2018 10:07 AM

I was recently asked by a client to write a password reset flow for a custom web application, with the ability to mass reset all passwords. Time was of the essence, and they had not yet interfaced with their SSO service using a custom application. I did the first thing any good developer/consultant would do when asked for a proprietary custom module that has been implemented by many applications in the software world: I checked the internet for the industry standard.

However, there was no fully descriptive industry standard to replicate, so after studying the forgotten password flows of common apps we use today, and going through the process of creating my own, I decided to write a blog post outlining a pseudo-standard with specifications. I will first discuss what I gathered from the industry leaders and then go into detail about the flow that I created.

Industry Leaders – What they have in common

The general flow of the password reset process for each industry leader that I tested had pretty much the same features. The applications that I took a look at were Facebook, Twitter, and Gmail.

Each application provided the password reset option outside of the scope of a logged-in user so that users who forgot their passwords could also reset their passwords.

Each application had a recovery email on file to send emails that would kick off the password reset process. For each application, those emails contained a link back to the website with a temporary ID or token that would be used as a key into the resetting of the password and choosing a new one. After a user picks a new password and confirms it by keying in the password once again, a button is clicked, a reset confirmation email is sent, and the reset is complete.

Industry Leaders – What is different

Although the flow was very similar for our three industry leaders, there were some contrasting qualities for each. The length and syntax of the temporary tokens, the form of the tokens, and the other methods provided to reset the password (besides through email) were some of the main differences between the three applications.

Facebook uses a 6-digit randomly generated numeric token as their reset code. This code resets each time the user clicks the "Continue" button (photo #2 below). If the user tries an old code, he or she will be redirected to an invalid token page. Facebook provides the code in the email as well as a link, both of which can be used to reset the password. Facebook also sends an email to each of the emails on file; the user does not have a choice in this.

Facebook 

Gmail, like Facebook, uses a 6-digit randomly generated numeric token as well. Unlike Facebook, Gmail does not reset the value of the token each time the user begins the reset process. For example, if the user requests a token to reset his or her password, never types that code in, and then requests a new token, the user will receive two emails with the same token value. Gmail sends the password token emails only to the single recovery email on file, and provides only a code (no link).

Google 

Twitter is a bit different from both Gmail and Facebook. Twitter uses a 43-character alphanumeric token along with a reset id, which is quite a bit more secure than the 6-digit tokens of both Gmail and Facebook. Twitter's 43-character alphanumeric token would be virtually impossible to guess, with 36^43 possibilities (about 8.3*10^66 combinations). A numeric token of length 6, on the other hand, has only about a million possible combinations, pretty easily cracked by a brute-force attack. Twitter sends users their reset token in the form of a double-encoded link; it does not provide a visible code to be copied and pasted like its Gmail and Facebook tech-leading comrades.

Password Reset window 

Encoded Reset Password Link:
Code block 

Token: GVmeSd07RmOe_bZbD_XfcC5L_1Y_wKLkYEIIHdBAL7U=-1526313952655

A side note on the industry leaders: All of them did have other options for resetting passwords, from storing and typing in old passwords, to texting the user’s phone on file, to using another account linked to the application. However, for the custom business application I was building, none of these were an option; therefore, I do not discuss them in this blog post.

My Reset Flow

Based on the password reset flows of the industry leaders discussed already, I moved to create my own password reset flow for the application. The main specifications took the same form as those of the industry leaders discussed earlier, which I will outline in detail here. As for the token, I decided to align more with Twitter, as it appears to be more secure, at least on the surface. By that I mean Facebook and Gmail may have some additional brute-force protection against malicious users trying to guess a valid 6-digit password reset code; I did not build the black-hat application to see if they did. Regardless, for my application, I decided to go with the more secure token rather than rely on hypothetical brute-force protection.

Forgot My Password Link and Page

  1. First, on the login page, or anywhere outside the logged-in scope, the user clicks a forgot password link that redirects them to the Forgot My Password page.

  2. The Forgot Password page contains:

    1. A form field for the user’s email address
    2. Contact information (tech support) and information regarding the password reset process
    3. Validation to verify that the user's email address takes the form of a real email address
    4. Submit button to trigger the email address submission and the reset process.
  3. Once an email address of correct form is entered and the submit button is clicked:

    1. The session cookie and the email address are sent to the server.
    2. The server processes the email address and validates that it belongs to a valid user.
    3. If the email address is valid, the server sends the user to the password reset completion/processing page and begins processing the reset request.

Server-Side Processing of Reset Request

  1. Once the server validates the email address, it begins the reset password process and logs the request.

  2. The server randomly generates, with significant entropy, a password reset token of 48 uppercase and lowercase alphanumeric characters (see the sketch after this list). The server stores the token in a database table along with the date and time of its creation, the email address of the requester, and the IP address of the browser associated with the reset request.

  3. To preserve the temporary nature of the token:

    1. If a password reset token already exists in the database table for the user’s email address, it is removed.
    2. If a password reset token already exists in the database table for the user’s IP address, it is removed.
    3. A job will be run on a weekly basis to empty the temporary password reset table.
  4. The server will send an email to the user’s email address containing the token encoded in a URL to the “new password page”.
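
Here is a minimal sketch of the token-generation step in JavaScript, assuming a Node.js back end (the application's actual server stack is not specified here):

var crypto = require('crypto');

var ALPHABET = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789';

function generateResetToken(length) {
    var token = '';
    for (var i = 0; i < length; i++) {
        // crypto.randomInt draws from a CSPRNG and avoids modulo bias.
        token += ALPHABET[crypto.randomInt(ALPHABET.length)];
    }
    return token;
}

// The resulting 48-character token would be stored with its creation time,
// the requesting email address and the browser IP, as described above.
console.log(generateResetToken(48));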

Getting a new password

  1. The user receives the email with the encoded link containing the temporary, randomly generated token.

  2. The user clicks on the link, which leads to a new view within the application where he or she can choose a new password. The token is passed to this new view via a URL parameter.

  3. The new password page contains:

    1. Contact information (tech support) and information regarding the password reset process
    2. Two form fields for password entry. Both entries must be identical and contain at least 8 characters, including at least one uppercase letter, one lowercase letter and one numeric digit.
  4. The page will contain a submit button that will initiate the server-side “password update” process and direct the browser to the “login” page.

  5. The submit button will also send the password reset token to the server-side “password update” process as a hidden field.

  6. The browser page direction to the “login” page will happen synchronously, but the server-side password update process will happen asynchronously.

Password Update Process

  1. Upon initiation of the server-side password update process, the server will compare the submitted password reset token to those stored in the database. If no token match is found, the process stops. If the token was stored more than 24 hours previously, the process stops.

  2. The server will examine the submitted new password for compliance with the strength requirements (see the sketch after this list). If not present or not sufficiently strong, the process stops.

  3. The server will store the new password in the user authentication table in the database, and remove the password reset token.
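
As an illustration of the strength check in step 2, a small JavaScript sketch that mirrors the requirements listed earlier (at least 8 characters, with an uppercase letter, a lowercase letter and a digit):

function meetsStrengthRequirements(password) {
    // One uppercase, one lowercase, one digit, minimum length of 8.
    return /^(?=.*[A-Z])(?=.*[a-z])(?=.*[0-9]).{8,}$/.test(password);
}

meetsStrengthRequirements('Abcdef12'); // true
meetsStrengthRequirements('abcdefgh'); // false (no uppercase or digit)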

Security; Admin; SharePoint Development | Jake Mozack
  
5/30/2018 9:13 AM

The world has been working with the MS cloud extensively for the last 8 years, and it seems like we almost have it mastered. Still, there are important nuances and connections that are not readily apparent without looking deeply into the documentation.

For our team, we use SharePoint Online so often that it is easy to forget there are at least 3 other key components to the cloud, Azure being one of the most important.
 

You can get to the Azure dashboard by using your O365 admin link from office.com

 

Often, Azure is a distant entity that rarely gets looked at after the initial rollout. Let's take a look at why we should be checking the health and functionality of Azure at least on a monthly basis.

First, let's be sure we are all on the same URL: https://aad.portal.azure.com/#blade/Microsoft_AAD_IAM/ActiveDirectoryMenuBlade/Audit

So, what is the role of Azure? Here are some key points to keep in mind:

  • Virtual machines
  • Virtual applications
  • User Import from Active Directory
  • Password synchronization
  • Multi-Factor enforcement
  • App usage statistics
  • User synchronization with O365 and SharePoint Online
  • Conditional Access
  • Risky sign-ins
  • Azure Auditing

As we can see, there is a lot going on in Azure, everything from importing users to controlling how and where users can access your tenant.  Some interesting points come up here, such as:

  1. If we want to have advanced user logins with dual-factor authentication, we need to look into Multi-Factor Authentication based in Azure.

  2. If we have an issue where some users from Active Directory are not synchronizing to O365 and/or SharePoint, then we need to look at the Azure AD Connect functionality.

  3. If we want to have advanced control and management of how users sign in, then we need to look at Azure and ADFS.

  4. If we want to control where and how users log into our tenant, then we need to look into Azure Conditional Access.

  5. If we want to understand any actions taken related to security, external access, etc., then Azure auditing will be a key source of information.

Granted, some of these options have an additional cost, but the functionality currently available through Azure in the cloud is similar to having your own on-premises data center with hundreds of thousands of dollars' worth of hardware appliances, not to mention the administrators to support those appliances. At the very least, we encourage all customers paying for Azure / O365 to take advantage of the features that are free just because you are a paying tenant.

Hopefully this post will help start filling in the details as to how Azure fits into your overall security and functionality roadmap. If you would like more details and guidance, reach out to the Protiviti ECM team. We would be happy to work through the topics that are most important to your organization.

Office 365; Admin; Security | Andy Arellano
  
5/18/2018 1:19 PM

Nintex is a wonderful tool. I say that because it is. However, there are times where working with Nintex can lead down a rabbit hole of “why isn’t this possible”.

Recently, we came across a situation where we needed to pass a managed metadata value into a workflow through a workflow start form.  On its own it sounds simple, right?  Well…it required a bit of The Midas Touch.  Nintex workflow start forms don’t pass managed metadata into the workflow itself as a variable.  It’s the only type of column that isn’t supported. 

The workaround for this was to create a text field on the start form and hide it.  JavaScript was written so that on submit, it would read the managed metadata value the user entered, grab the label, and put the label into the hidden text field, which was connected to a workflow variable.

Targeting the fields is another hurdle.  There is no reliable way to target a field on a Nintex form without adding a class to the field.  The trick here is to add a CSS class through Nintex Forms that the JavaScript can target.  Click the container that the field sits in and add a CSS class at the top.

 

Once that is set, double-click your submit button and under the Advanced tab, in the Client click section, type: return validateOnSubmit();

validateOnSubmit() will be our JavaScript function to copy the label over to the text box for use in the workflow.

 

The JavaScript itself looks like this:

var validateOnSubmit = function () {
    var sourceTarg = NWF$('.targ-tax input');
    var destTarg = NWF$('.dest-tax input');
    var terms = sourceTarg.val();
    /*terms is returned as Label|Guid which can be used to update the value in the workflow. For this project we just needed the GUID as we were looking up to the Taxonomy Hidden List to get the full path of the term. The following splits the terms functions at the bar and returns just the GUID and may not be needed in your code*/
    var term = terms.split('|')[1];
    destTarg.val(term);

    return true;
};

That should go in the Custom JavaScript portion of the form settings of the start form of the workflow.  Remember to return true in the function so the submit button does our fake validation properly.

Go ahead and test it out and Nintex should be able to use the text field as a workflow variable.

Taxonomy Hidden List

As mentioned in the code comment, we were also utilizing the Taxonomy Hidden List to get the full path of a term by its GUID. If you are not familiar with the Taxonomy Hidden List, it is, unsurprisingly, a hidden list that lives in each site collection at /Lists/TaxonomyHiddenList and has some great details about the managed metadata values used in that site collection. Note: this list only contains values for terms used in the site collection and does not contain values for the Term Store as a whole.

In our instance, we were using the IDForTerm column to match the GUID (passed in as a workflow variable) and retrieve the Path column. One thing you will probably notice below is that there are 3 different columns for Term and Path. The Taxonomy Hidden List also stores any translated values, so if you have a multi-lingual intranet you can view all of the translated values here. In this instance you can see the German values (1031) and the English values (1033). The CatchAllDataLabel column can also be used in a translated intranet, as this field contains all of the translated values for the term in question.

 

Other Uses

We also found this same code snippet handy for passing in people values. While the Nintex start form allows you to pass in people without an issue, we had an instance where we had to dynamically limit the people picker to a SharePoint group, and found that Nintex requires you to specify the group in the workflow variable itself and does not present this option for a start form control tied to a variable.

However, by using the Get Query String function and some front-end logic, we were able to pass a value through the URL to the start form to limit the selection as needed in the start form control, and then, using the same JavaScript approach as above, pass that value into the workflow variable control.
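
Here is a rough sketch of that front-end piece. The query string parameter name and the "dest-group" class are hypothetical, and the registration call assumes the classic Nintex Forms client API:

NWF.FormFiller.Events.RegisterAfterReady(function () {
    // Read a value passed on the URL, e.g. ...?group=HR%20Approvers
    var group = new URLSearchParams(window.location.search).get('group');
    if (group) {
        // Copy it into a hidden text field (CSS class "dest-group") that is
        // connected to a workflow variable, the same trick as validateOnSubmit().
        NWF$('.dest-group input').val(group);
    }
});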

 

Nintex; Metadata; Tips & Tricks | Mike Schaefer and Matt Lavieri
  
5/11/2018 3:07 PM

Our clients often express the logistical and practical challenges of UAT (User Acceptance Testing). These challenges include inaccessibility of business users due to competing priorities, long test cycles, lack of end-user training on the application and lack of visibility into the process. A few months ago, I was engaged on a project to develop a UAT strategy and support its execution. This client had a unique and refreshing approach to UAT and avoided some of the common obstacles.

The client planned to involve a large population of end-users in UAT testing. The effort was carried out in 6 waves, with each wave performing a week of training and testing.

Below are a few key attributes that contributed to the success of the weekly UAT waves:

  1. Secure commitment at the top.
    A common UAT challenge is that the testers have too many other priorities competing with the UAT tasks. Success is not going to be possible without leadership communicating loud and clear to the testers and their supervisors that UAT is the most important activity for that week.

    Our client’s leadership reinforced the importance of the testers’ participation in UAT in each weekly session by inviting executives to speak to the group about the impact that their involvement is making for the company. The result was a motivated and engaged community of business user testers.

  2. Get everyone together in one place, outside of the typical workplace.
    Communication is a major factor in ensuring that the testing is productive. While collaboration technologies are becoming more prevalent in today's business environment, there is no replacement for a face-to-face conversation where feedback flows more freely and non-verbal cues aid in perceiving user experience. Co-locating everyone involved with the testing sessions reduced the delays and interruptions that typically occur when clarification or technical troubleshooting is required. Ancillary benefits of removing testers from their usual environment are that it reinforces the concept that leadership sees this as a priority, the testers are recognized for their participation, and it encourages networking within the organization that typically would not occur.

    In addition to the tester participants, the client organizers also invited representatives from the following groups at the sessions:

    • Product owner and corporate trainers with in-depth knowledge of the application
    • IT directors and analysts who define requirements and test the application
    • Implementation and QA vendors
    • IT infrastructure and help desk to ensure network stability and security/access

    Collectively, the supporting staff was able to gather feedback effectively and remove impediments quickly, providing a positive experience for the testers.

  3. Bring the right equipment.

    While seemingly obvious, this is an aspect of testing that is often overlooked. UAT testers need devices that are similar to what they have in their office and out in the field. Are they usually using tablets in the field? Do most of them have a desktop computer? The answer determines the device they should be using to perform testing activities. In addition to testing devices, equipment for collaboration and productivity should be available.

    Here is some of the equipment that was available during the sessions: Laptop docking stations, supplies for wireless and wired internet, projector, printers and easels (for white boarding).

  4. Set ground rules and expectations
    Communicating ground rules and expectations allowed our client organizers to maximize the efficiency of testers and resulted in a better experience for the entire group. Our client organizers asked the tester participants to put their devices away and postpone work tasks while executing testing. They were also respectful of the testers’ work obligations and ensured there were plenty of breaks to answer calls and emails. This allowed testers to focus on the task at hand.

    Expectations were also communicated prior to the commencement of testing. The organizers acknowledged that the software tested was not the release candidate and would have several iterations of revisions prior to going live. They also shared that while they would document ideas and enhancements, these items would be deferred to a later release. This resulted in testers having a positive impression of the state of the project and directed their focus on reporting critical business process issues.

  5. Plan, plan, plan
    An organized UAT event is a successful UAT event. Thorough planning shows that the organizers value the testers' participation, and it prompts the testers to demonstrate the same qualities. The client organizers defined a detailed agenda for each day of training and testing. In partnership with Protiviti, they defined test scenarios, test scripts and a process for tracking progress and defects. The agenda contained more activities than we anticipated could be done within the timeframe. When things didn't go as planned, they easily switched tasks and the testers continued to be productive.

    Another success contributor in the process was the effort the client organizers made to test the application thoroughly prior to each round of UAT. While there were occasional unexpected blockers, the overall impression was that the application was stable and unnecessary pauses were avoided.

  6. Visualize progress, status and metrics
    A key UAT objective of our client’s leadership was ensuring full test coverage of functionality. Protiviti designed informative dashboards which tracked test execution progress, results and outstanding activities. The charts provided visibility to a wide audience and allowed leaders to make informed decisions.

    Dashboard Examples

Most importantly, the client organizers were generous in their appreciation of the testers. They shared their gratitude, introduced "rituals" and ice breakers, provided treats, took breaks to connect personally and even held elaborate birthday/work anniversary celebrations in honor of some of the tester participants. As an observer of this process, it was a memorable experience for me, and I imagine that it built a foundation of collaboration between the Information Technology group and its business user community.

User Adoption; Project Management; Quality Assurance and Testing | Julia Bliss
  
5/7/2018 9:37 AM

  1. Site Collection Associations: Since Hub Site associations are at the Site Collection level, you should consider creating new Site Collections for most new sites. This will allow the greatest flexibility to grow and adapt with your business. If your business were ever to go through a reorganization, you would not need to migrate data or have anyone update their bookmarks; all you would have to do is associate the Site Collection with a new Hub Site and you are done.

     

  2. Use the Content Type Hub: I detailed using the Content Type Hub in a previous post and mentioned its importance in the Modern UI, as everything in O365 SharePoint is moving to Site Collections. To give users the most control over what data rolls up when configuring web parts, it is imperative that all of the content in your O365 tenant uses the same metadata so that rollups function appropriately.

  3. Themes, Navigation, and Logos: After you associate a site with a Hub, it will have the same theme, navigation, and site logo as the Hub Site. This way, all sites belonging to the same Hub will have a consistent user experience. Additionally, if you have built out an intranet and have a homepage, I would also suggest adding a homepage link as the first link in all of your Hub Sites to maintain a cohesive experience across all sites.

     

  4. Get Familiar with PowerShell: At the time of writing, the only way to manage Hub Sites is through the SharePoint Online Management Shell. While I could spend the rest of this blog detailing the cmdlets, Microsoft has already provided great documentation on them all: https://docs.microsoft.com/en-us/sharepoint/dev/features/hub-site/hub-site-powershell

    PSA: If you haven’t checked out the new docs.microsoft.com for SharePoint I would strongly recommend bookmarking the site: https://docs.microsoft.com/en-us/sharepoint/

  5. Take advantage of Hub Site associations: While one of the selling points of using a Hub Site is aggregating all content from the associated sites into the Hub, the associated sites can also take advantage of this functionality and aggregate data from their 'sibling' sites. This feature can come in handy to ensure all sites in the same hub are always showing the same calendar, so no one misses that important meeting.

     

Enterprise Content Management; Tips & Tricks; SharePoint Design and UX; Office 365Matt Lavieri
  
4/27/2018 10:30 AM

Cool designs make people excited to use new technology. Whether the new system will be a total overhaul of a company’s intranet or a smaller solution tailored to one department, the end users are oftentimes the most interested in what the final site will actually look like. Where will their calendars go? What will the navigation contain? Where can they update the daily menu for the food vendor in the office?

A wireframe is a design containing a collection of boxes and interactive elements that becomes the roadmap for implementing a web-based solution. It gives developers an end goal: users need to display the site navigation here, and the dropdown menu needs to contain these links, etc. These mockups of websites give users an opportunity to play around with spacing, content placement, and the idea that maybe the finance department’s birthday calendar doesn’t need real estate on their organization’s homepage.

From drawing squiggly boxes on a notepad to drafting fully functional prototypes that are hosted on third-party websites, wireframes can be as basic or elaborate as the stakeholder needs. Sometimes those squiggly hand-drawn boxes give users more flexibility and creativity when envisioning their new site, but sometimes they want to “see the real thing” before committing to a build. That’s when it’s time to download Axure.

Axure is a wireframing tool that allows users to turn wireframes into true prototypes. It is in the same family of applications as Balsamiq, but is more comparable to a Photoshop-esque tool. Users can add animations, functional navigational elements, and a variety of interactions to build a prototype that feels like a real website. Designers can import images and screenshots to build a prototype that looks identical to a finished product. Axure’s best feature is the web page preview – a user can easily generate a privately hosted URL that lets team members interact with their prototype in a web browser.

screenshot of the wireframe, highlighting a feature
Figure 1: A prototype created in Axure

screenshot of the wireframe
Figure 2: The prototype above displayed in a browser

It's important to note that while Axure has a host of benefits from the perspective of an experienced designer, there are just as many teams who don’t need a functional prototype to get to work on a new site. Even when building a prototype, it’s important to start with basic wireframes and add embellishment as the design progresses. At its core, a wireframe is only as good as the content it describes.

So before starting on the design phase of an engagement, ask yourself these questions to determine if your stakeholder needs a functional prototype or a basic wireframe:

  1. Does the stakeholder have strong branding guidelines?
  2. Has the stakeholder compiled examples of sites they would like to use in their design?
  3. Is the project team familiar with website design?
  4. Does the team need to present a demo of the site to management?
  5. Will the site have many interactive elements (animation, transition, scrolls, etc.)?
  6. And we’d be remiss not to ask – does the project’s budget allocate adequate time to build a prototype?

If the answer to all of these questions is yes, a prototype is the way to go. Otherwise, stick with a basic wireframe. Even those squiggly lines will do.

Tips & Tricks; SharePoint Design and UX; User AdoptionLily Carmody
  
4/20/2018 10:38 AM

​Overview

eDiscovery (Electronic Discovery) is the process of identifying, finding and capturing electronic information to be utilized as evidence in legal cases. eDiscovery is one of many features built into the Office 365 Security and Compliance Center. eDiscovery allows authorized users to search, investigate and place Office 365 content and conversations on hold for legal purposes. Content can be found and held based on its location, content conditions, as well as keywords/phrases within the following services:

  • SharePoint Online documents
  • Email content
  • OneDrive for Business documents
  • Group / Shared Mailbox content
  • Microsoft Teams content
  • Skype for Business conversations

Basic eDiscovery in Office 365 is fairly intuitive and we will break down the process in the following 4 steps:

  1. Office 365 eDiscovery Roles
  2. Creating & Managing eDiscovery Cases
  3. Placing Content Locations on Hold
  4. Performing Content Searches and Exporting Results

As a note, Microsoft is currently rolling out a modern user experience within eDiscovery. All tenants will get this new user interface at different times, with the option to return to the classic experience if users choose. Screenshots within this blog show the modern user experience, but note that changes are constantly happening and pictures here might vary from what you experience in your tenant. Some functionality is slightly different between the two experiences, but mostly it is the same tools and features displayed differently to the users.

1. Office 365 eDiscovery Roles

As a first step, the appropriate permissions must be assigned to allow users the ability to interact with the eDiscovery Center and cases. A user has to be a member of the Organization Management role group (or be assigned the Role Management role, or be a Global Admin) in the Office 365 Security & Compliance Center to assign eDiscovery permissions. eDiscovery Managers will then have the ability to create and manage cases, and add users to eDiscovery cases. eDiscovery roles are broken into two groups, Reviewers and Managers (which are further distinguished as either Managers or Admins).

eDiscovery roles and the corresponding permissions are as follows:

Reviewer
Reviewers can only see and open cases on the eDiscovery page that they have been made members of by a Manager or Admin.

Can:

  • View / open their cases

Cannot:

  • Create cases
  • Add members to a case
  • Place content on hold
  • Create searches
  • Export results
  • Prepare Advanced eDiscovery results

eDiscovery Manager
Managers can create eDiscovery cases and manage all activities of cases that they are members of.

Can (only for their cases):

  • View / open their cases
  • Add / delete case members
  • Place content on hold
  • Create / edit content searches in case
  • Export content search results
  • Prepare results for Advanced eDiscovery

Cannot:

  • View / open other cases (cases they are not members of)

eDiscovery Admin
Administrators can perform all case management tasks that an eDiscovery Manager can do, while also having access to ALL eDiscovery cases and the ability to perform all Advanced eDiscovery tasks.

Can (for all cases):

  • View / open ALL cases
  • Add / delete case members
  • Place content on hold
  • Create / edit content searches in case
  • Export content search results
  • Prepare results for Advanced eDiscovery

eDiscovery Roles are assigned by an authorized user (Global Admin / Org. Manager) in the Office 365 Security & Compliance Center. To get there, navigate to https://protection.office.com, select the desired role group, and edit it within the management panel.

screenshot 

Once all of the necessary users have been assigned the appropriate roles within the Security & Compliance Center, eDiscovery Managers can begin to create / manage eDiscovery cases and add users to those cases.
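
If you prefer scripting, the same role group membership can be managed from Security & Compliance Center PowerShell. Below is a minimal sketch; it assumes you already have a Security & Compliance PowerShell session, the user address is made up, and the exact role group name in your tenant should be confirmed with Get-RoleGroup:

# Add a user to the eDiscovery Manager role group (name and member address are placeholders)
Add-RoleGroupMember -Identity "eDiscovery Manager" -Member "legal.user@contoso.com"

# Confirm the membership
Get-RoleGroupMember -Identity "eDiscovery Manager"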

2. Creating & Managing eDiscovery Cases

Before creating and managing eDiscovery cases, it is important to understand that a "case" is simply a logical container or grouping of content holds, searches and results within eDiscovery. Cases can have one or many holds and searches within them.

screenshot 

To begin performing eDiscovery work, from the Security & Compliance Center, select Search & Investigation in the left-hand menu and click "eDiscovery". eDiscovery Managers and Admins can create new cases at the top of the page by selecting Create a case, then providing a name and description for the case.

screenshot 

Upon creation, Managers can quickly perform the following tasks from the case management fly-out:

  • Add / remove / search for case members
  • Edit the name / description
  • Close / delete the case

screenshot 
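
Cases can also be created and staffed from that same Security & Compliance PowerShell session. A hedged sketch, with a made-up case name and member address:

# Create a new eDiscovery case, add a member to it, and list existing cases
New-ComplianceCase -Name "Contoso Litigation 2018-001" -Description "Holds and searches for the Contoso matter"
Add-ComplianceCaseMember -Case "Contoso Litigation 2018-001" -Member "legal.user@contoso.com"
Get-ComplianceCase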

Opening the case will give the manager full access to the case and the eDiscovery features. From the case management page, users can complete the next 3 activities:

(3) Creating and managing Holds
(4) Creating and managing Searches
(5) Exporting Search Results and preparing them for Advanced eDiscovery

screenshot 

3. Placing Content Locations on Hold

Within a case, the first thing the user should do is create a hold on content. A hold simply preserves content from being modified or deleted during the discovery process. Content on hold cannot be deleted by any user while the files and their contents are under review. Office 365 users will not know if content they are working with is on hold.

To create a hold, a user must:

  1. Create, name and describe the hold
  2. Determine the hold location across:
    • Exchange Email: If an item is deleted in Exchange, the content will be placed in the hidden "Recoverable Items" folder
    • SharePoint Sites / OneDrive for Business Sites: If an item is deleted in SharePoint / OneDrive for Business, the content will be placed in the hidden "Preservation Hold" library
    • Exchange Public Folder
  3. Create a query for the hold
  4. Review and create the hold

Hold queries follow KQL syntax, which leverages Boolean and other operators, as well as symbols, for performing searches. Read more on the KQL syntax.

We often recommend that our clients leverage locations for holds, but not queries (or at least not extremely specific ones). If your query for a hold is too narrow, it may not preserve content that it should, and that content could be modified or deleted while you refine your query. The best approach is to preserve more content with your hold, then narrow the content analyzed through your searches.

screenshot 
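
The same hold can be scripted with the case hold cmdlets in Security & Compliance PowerShell. The sketch below uses placeholder locations and a deliberately broad KQL query, in line with the recommendation above to hold broadly and narrow later:

# Create a hold policy scoped to a mailbox and a SharePoint site (locations are placeholders)
New-CaseHoldPolicy -Name "Contoso Matter Hold" -Case "Contoso Litigation 2018-001" -ExchangeLocation "jane.doe@contoso.com" -SharePointLocation "https://contoso.sharepoint.com/sites/Legal"

# Attach a rule; omitting -ContentMatchQuery (or keeping it broad) preserves more content
New-CaseHoldRule -Name "Contoso Matter Hold Rule" -Policy "Contoso Matter Hold" -ContentMatchQuery '"Project Falcon"'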

4. Performing Content Searches and Exporting Results

Searching

Once holds have been created, eDiscovery Managers can then configure their search(es) that are relevant for the legal matter.

The query editing screen allows users to configure search elements (keywords, conditions, and locations) and execute a search. Search will apply query keywords to all content properties (file title, body, etc.). Users may also preview & export search results.

See below the Keyword and Condition areas:

screenshot 

The Search will also allow the Manager to preview the results returned, as seen below:

screenshot 

Once you are satisfied with the search results, Managers can (from the quick management fly-out > "more" drop-down) export the results for analysis, export a high-level statistics report, or prepare results for Advanced eDiscovery (we'll go into this more in another blog post).

* Users with eDiscovery Permissions (Managers, Security & Compliance Admins, etc.) can also perform queries through the Content Search tool (Security & Compliance > Search & Investigation > Content Search). This enables you to freely search and configure queries without having to manage full eDiscovery cases. This is a good way to practice KQL syntax and refine search queries.
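
Searches can likewise be created and run from PowerShell, which is a convenient way to iterate on KQL before committing to a query. A minimal sketch; the case name, locations and query are illustrative:

# Create a search inside the case, run it, then check its status
New-ComplianceSearch -Name "Falcon Search" -Case "Contoso Litigation 2018-001" -ExchangeLocation All -SharePointLocation All -ContentMatchQuery '"Project Falcon" AND sent>=2018-01-01'
Start-ComplianceSearch -Identity "Falcon Search"
Get-ComplianceSearch -Identity "Falcon Search" | Format-List Name, Status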

Exporting Search Results

Once you are ready to export the results of an eDiscovery Search, you will have several options regarding the format and structure of the output. Case Managers will have options around the following:

  • Handling of result items that have an unrecognized format, are encrypted, or were not indexed for other reasons
    • Include unrecognized, encrypted, or un-indexed items
    • Exclude unrecognized, encrypted, or un-indexed items
    • Only export unrecognized, encrypted, or un-indexed items
  • Handling of Exchange (Email) output
    • One PST per mailbox
    • One PST file for all messages
    • One PST file containing all messages in a single folder
    • Separate all messages individually
  • De-duplication of Exchange content (finds similar/duplicate emails and combines email threads into one item)
  • Including SharePoint file versions
    • If versioning is already enabled in SharePoint: all versions of an on-hold item will be copied into the Preservation Hold Library
    • If versioning is NOT enabled in SharePoint: all versions of the on-hold item AFTER the hold is applied will be copied into the Preservation Hold Library

screenshot 
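
The export itself can also be kicked off from PowerShell, although the actual download still happens through the eDiscovery Export Tool in the browser. A hedged sketch:

# Start an export action for the completed search and check its progress
# (the action is typically named "<search name>_Export")
New-ComplianceSearchAction -SearchName "Falcon Search" -Export
Get-ComplianceSearchAction -Identity "Falcon Search_Export"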

5. Closing / Re-Opening Cases

I know what you're thinking, this was promised to be done in 4 steps, but I'm including step number 5 to close out the process for basic eDiscovery. All cases will eventually be closed, potentially re-opened, and ultimately deleted at some point. It's important to consider the effects of the holds, search results and cases themselves under each action. Below are the ramifications of each outcome of the case:

  • Closed: all holds are turned off and held content is released. Files that had previously been deleted while on hold are then released, which can result in a loss of that content. All configurations for the case are maintained, and the case can be re-opened at any time.

  • Re-opened: existing holds and searches will be re-enabled, however, items changed while the case was closed are not retained.

  • Delete: a deleted case will lose all configuration settings permanently. All holds, searches and result exports will be deleted.

I hope you found this blog insightful and that it provided you with some tools to start working with eDiscovery in Office 365.

Enterprise Content Management; Governance; Admin; Office 365; Search; SecurityMichael Essling
  
4/13/2018 1:38 PM

In the last couple months, Microsoft has introduced a new feature to Office 365 called Site Scripts. They are, basically, a representation of steps to perform automatically after a site has been created.

What this means is you can automate boilerplate steps for new sites, like creating lists, adding content types to those lists, tweaking navigation elements, executing a Flow, and tapping into Azure Functions to do some heavy lifting when provisioning new sites. You can even set a new site theme automatically.

screenshot

The site script itself is a JSON structure representing steps called “verbs”. Below is an example of a site script that, on site creation, creates a list called “Customer Tracking” and adds a handful of columns to it.

script

It's a really straightforward process. You can view more details on the actions and verbs here: https://docs.microsoft.com/en-us/sharepoint/dev/declarative-customization/site-design-json-schema
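
Since the example above is shown as an image, here is a hedged sketch of roughly what such a script looks like, following the published schema linked above and stored in a PowerShell variable so it can be passed to the cmdlets below (the field names are illustrative):

# Site script JSON held in a here-string; "createSPList" and "addSPField" are verbs from the schema documentation
$siteScriptJSON = @'
{
  "$schema": "schema.json",
  "actions": [
    {
      "verb": "createSPList",
      "listName": "Customer Tracking",
      "templateType": 100,
      "subactions": [
        { "verb": "setDescription", "description": "List of customers and contacts" },
        { "verb": "addSPField", "fieldType": "Text", "displayName": "Customer Name", "addToDefaultView": true, "isRequired": true },
        { "verb": "addSPField", "fieldType": "Number", "displayName": "Requisition Total", "addToDefaultView": true, "isRequired": false }
      ]
    }
  ],
  "version": 1
}
'@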

Once you have a JSON structure of a site script, you then add it to your tenant via PowerShell. Make sure you have the latest SharePoint Online Management Shell as these are new features. Ensure you remove any previous versions as they will conflict.

The two commands are:

$SiteScript = Add-SPOSiteScript -Title "Create customer tracking list" -Content $siteScriptJSON -Description "Creates list for tracking customer contact information"

Add-SPOSiteDesign -Title "Contoso customer tracking" -WebTemplate "64" -SiteScripts $SiteScript -Description "Tracks key customer data in a list"

The first command adds the script to the tenant itself. The second command registers a site design that makes the script available when creating new sites from the specified web template ("64" is the modern Team site).

Once the script is added and registered as a site design, you can go create a site using the new design. After the site is created, you should see your script execute, checking off items in a pane on the right as it completes.

As with any new feature, Microsoft is adding more abilities to these site scripts, so stay tuned.

Office 365; SharePoint Design and UXMike Schaefer
  
4/6/2018 10:03 AM

Since its inception, SharePoint has given its users a vast amount of control over how information is laid out and how the overall aesthetics of a site look. This level of control has been incredibly beneficial in the corporate world because it allows for company branding (such as logos, color palette, etc.) commonly used in other public-facing materials such as advertising to be consistently applied. The ability to deliver a unified brand across disparate mediums and technologies is vital to producing a cohesive purpose and strengthening a company’s overall goals.

Current Approach

Traditionally, branding has been accomplished in a few ways. Depending on the complexity of the requirements, it may have been implemented through something as simple as a new stylesheet or theme that does nothing more than change the color scheme. For more complex endeavors, customization of master pages and page layouts were usually required to have more control over the “frame” of a site and introduce highly personalized elements such as custom headers, footers, and web part zones. This approach has been used with great success for over a decade and is still in heavy use on most on-premises sites active today.

With the introduction of Office 365, businesses were no longer required to maintain their own infrastructure and could instead choose to host their SharePoint implementations in the cloud. For many years this behaved similarly to on-premises implementations and users could introduce customizations through the approaches detailed above. Within the last few years, however, Microsoft began to rethink the way the user interface in SharePoint works at its core.

The Modern Experience

It’s not a revolutionary statement to say that technology changes at a rapid pace. Because of this constant change, the ideas powering a user experience considered great one year may be seen as annoyingly antiquated the next. Microsoft has noticed specific web trends that many seem to appreciate and has made strides to incorporate them into SharePoint and update its overall user experience (seriously, did anyone ever really like having postbacks for every little thing?).

This effort has culminated in what Microsoft refers to as the modern experience. If you have interacted with the new team sites, list/library views, etc. you have most likely noticed that the overall design looks different.

1 Default Modern Experience look and feel

While the modern experience introduces new aesthetics (which are heavily appreciated!), there is something far more important happening under the hood.

The modern experience is unique in that it’s no longer a traditional aspx page with web parts that those of us using SharePoint have interacted with for well over a decade. A modern experience page is now a monolithic React application. Microsoft adopted React a few years ago as the preferred framework used to scaffold the new client-side SharePoint Framework web parts. React provides many benefits - most importantly in that it’s a battle-tested framework that is heavily used elsewhere on the web and has massive amounts of support.

The Problem

So at this point, you may be saying “This is fantastic! I want to begin using this everywhere on my corporate intranet immediately!” You then notice that your highly-customized master page doesn’t seem to be working on it. You also notice that the Alternate CSS URL you have on your normal publishing/team sites doesn’t seem to be working here either.

Since the pages aren’t traditional ASP-style affairs, virtually none of the tried and true branding methods work anymore (major exception being the SharePoint theming engine).

When the modern experience was first introduced, there was no real way to customize the look and feel of it in any significant way. Worse yet, many companies recognized the inconsistent user experiences could potentially confuse their users and disabled the modern experience altogether.

The Solution

Microsoft received tons of feedback on the modern experience and one of the major complaints was about the lack of customizability. To remedy this, they introduced SharePoint Framework Extensions in the latter half of 2017.

These extensions allow developers to introduce custom components into the modern experience pages. The extensions utilize the same nodejs-based build system that the SharePoint Framework web parts are currently made with. While it’s important to note that the extensions do not allow customization of every component, they do provide three main areas of control:

  • Application Customizers: these types of extensions allow for script injection onto the page and offer placeholders in commonly-customized areas of the page such as headers and footers.
  • Field Customizers: these types of extensions allow for modifications to views and field rendering. These are essentially JSLink technology of old, but for the new modern experience.
  • Command Sets: these types of extensions allow the construction of new actions. These actions are essentially analogs to the custom action overrides that were available in the older SharePoint experience.

Out of these three, Application Customizers are the ones that will most greatly benefit visual designers. The two main placeholders (header and footer) that have been implemented allow pages using the modern experience to be “framed” with common branding. Custom functionality such as a pop-out alert pane, bookmarks system, or personal calendar/reminder system can be implemented this way.

2 Custom branding elements and functionality within SPFX Extension Placeholders

More information about the SharePoint Framework Extensions can be found on Microsoft’s website at https://docs.microsoft.com/en-us/sharepoint/dev/spfx/extensions/overview-extensions. This site contains background information and invaluable tutorials on how to set up the build system, produce a sample project, and deploy it to your tenant. Sample code related to all three types of extensions can also be found in the excellently-maintained Github repository located here: https://github.com/SharePoint/sp-dev-fx-extensions.

SharePoint Design and UXWilliam Klingelsmith
  
3/16/2018 9:50 AM

What is OneNote for Business

OneNote is a digital notebook that exists in the Office 365 cloud. Take all of your plans, notes and ideas with you wherever you go. You can share your notes with colleagues to make sure everyone is on the same page. Using apps in both the Android and iPhone marketplaces you can access your notes while on the go!

Mimicking the way you are already used to taking notes, OneNote allows you to be creative with how you take notes.  Touch screen devices allow you to move content around with the touch of a finger.

Benefits

  • Access your notes from any device. Changes and updates are automatically saved to your local device and synched to the cloud so you never lose your content.
  • Use tagging capabilities to make important notes stand out.
  • Rich text formatting allows you to be creative about how you organize your notes. Adding tables and linking concepts together is easy.

OneNote Basics

  1. Install OneNote, and other Office applications, by accessing your Office 365 portal and logging in using your Office 365 credentials.

  2. Adding notes on OneNote is easy! Simply create a Section and Add a Page to get started. Add a title to help find your page later.

  3. Quickly share your notes with others you are working with by putting them on SharePoint or OneDrive. In fact, many SharePoint sites already come with a group Notebook that you can connect to from the OneNote application.

OneNote User Interface Overview

The following areas will help you use and navigate your OneNote:

  1. The Ribbon – This is the same ribbon you see in all Microsoft applications. Each tab in the ribbon allows you to interact with content in a different way.
  2. File – Access the file properties, save, share, print etc.
  3. Notebooks – Select the specific Notebook you want to work in from the list of Notebooks you have on your local device as well as in the cloud.
  4. Sections – Organize your notes into different sections. Each tab represents a new section.
  5. Add a new section – Add a new section to the current Notebook.
  6. Search – Search your Notebooks for key words and other attributes.
  7. Page Title – Title of this page within the current Section in the currently selected Notebook.
  8. Add page – Add a new page to this Section of the Notebook.
  9. Page Selection – Take action on the selected page.
General; Office 365; Tips & TricksJosh Jackson
  
3/2/2018 2:22 PM


​What is OneDrive for Business?

OneDrive is a tool for managing and sharing documents across devices. It operates much like a file share system with many more benefits. OneDrive is one of many tools available to you through Office 365.
When a document is added to OneDrive, the document is stored in the Office 365 cloud, making it available almost anywhere. When you edit the document, the changes are automatically stored in the cloud. You control access to your documents and can share them with people inside and outside of your organization.
Benefits
•  Quick storing and accessing of documents across all your devices.
•  Cloud storage and offline capabilities make your documents available anywhere!
•  Share documents with only the users you select by sending them a link.
•  Use apps for your Android or iPhone to access your OneDrive.

OneDrive for Business Basics
1. To access OneDrive go to your company’s Office 365 portal and log in using your Office 365 credentials. 


Helpful Hint: You can access Office 365 from any computer with an internet connection.  Go to: https://login.microsoftonline.com/ and provide your work Email Address and Password to connect to OneDrive.

2.  Adding Files to your OneDrive is easy! Simply select the files you want to add and “drag and drop” them into the desired folder.
3.  Quickly Share files and manage access with both internal and external users:



OneDrive Home Page Overview


 
The following areas will help you use and navigate your OneDrive:
1.  New – Add Office documents to your OneDrive; depending on your licensing, certain applications may not be available. Use folders to organize your files and separate who you share them with.
2.  Upload – Add documents you select from your file share or local drives.
3.  Flow – Create simple workflows to send notifications, synchronize files and much more!
4.  Sync – synchronize your files between your local computer and the cloud  


5.  Files – Select files for actions in the ribbon. Use the other attributes such as Name, Modified By and File Size to find specific documents.
6.  Search – Search your documents for key words and other attributes.
7.  Quicklaunch –
  Files – Sort through your files in OneDrive.
  Recent – Look at the most recent documents you have worked on.
  Shared with me – Look at documents other users of OneDrive have shared with you.
  Recycle Bin – Recover files you have deleted for up to 93 days after they have been deleted.
  Office 365 Groups – View existing or create a new Office 365 Group.

Office 365; General; Tips & TricksJosh Jackson
  
3/2/2018 10:37 AM


The Content Type Hub is a great way to keep all of your Content Types in sync across your environment; this includes the Names, Columns, and the GUIDs, which makes search that much more powerful within your organization. Since Microsoft is moving to using site collections for Modern Sites, a Content Type Hub is almost necessary to get the most out of Office 365 and SharePoint implementations. As the Content Type Hub will play an important role in the future of SharePoint, I wanted to mention the top 5 things to keep in mind while using the Content Type Hub.

1.    Remember to Publish and Re-Publish your Content Types after making changes
The most common mistake I see is people forgetting to publish their Content Types and then wondering why their updates are not reflected in other Sites. So whenever you make an update, take that extra step to publish down your changes.



2.    Unpublishing a Content Type will not remove the content type from a Site Collection
Unfortunately, there is no “Undo” button after a Content Type has been published. If you truly need to remove a Content Type, you will have to go from Site Collection to Site Collection to remove it. Unpublishing will just prevent it from being downloaded by the Web Apps/Site Collections in the future. One of the key mistakes I see here is that a Content Type will be published and then a user realizes that the wrong parent was chosen; users will then unpublish and delete this Content Type and create a new one with the same name, which of course will cause an issue when the new Content Type is published. When planning on using the Content Type Hub, be sure to properly plan your Information Architecture to avoid such conflicts.



3.    Don’t create a Site Content Type with a Name already in use by the Content Type Hub (and vice versa)
Sometimes you have to use a Site Content Type, or maybe one already exists due to some legacy data. Just remember to keep the names unique otherwise there will be an issue publishing the Content Type.


4.    View the content type publishing error log
I feel that while this may be self-explanatory many people forget that the error log exists when troubleshooting any issues with the Content Type Hub. Navigate to Site Settings, then select Content Type Publishing under the Site Collection Administration heading. 



5.    If you are maintaining multiple environments, such as Development, QA, and Production, be sure to keep them all in sync.
This can be done via a content database back-up and restore or a shared service application. In either scenario, be sure that the Content Type Hub URI is set correctly. 
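
For the shared service application route, the hub URI can be checked and set from the SharePoint Management Shell on the farm. A rough sketch, assuming an on-premises farm and a placeholder hub URL:

# Point the Managed Metadata service application at the correct Content Type Hub site collection
$mms = Get-SPServiceApplication | Where-Object { $_.TypeName -like "Managed Metadata*" }
Set-SPMetadataServiceApplication -Identity $mms -HubUri "https://sharepoint.contoso.com/sites/contenttypehub"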


General; Tips & TricksMatt Lavieri
  
1/4/2018 2:15 PM


For many organizations moving to Office 365 or other Cloud services, the concepts of security, compliance and risk are complex. They require learning about how these security concepts have changed and how they’re now implemented in a Cloud first, Mobile friendly world. They often require working with security experts to evaluate the current state of security for the Cloud application that you’re concerned about… and to determine which security capabilities and features you are and are not yet making use of.

When we worked in on-premises server environments, things seemed almost easier in some ways because our server farms, which hosted Exchange, SharePoint, Skype for Business and so on, were all within our corporate networks. They were more under our control, and we felt some level of comfort from being able to stop internet traffic at the network boundary, usually through our firewalls or gateways.
Regardless of how truly secure or not our networks and applications in fact were, we often gained some comfort from this boundary.

With the advent of Cloud computing, with the desire to do work on a whole range of Mobile devices, even our own personal devices, and with the desire to access our services for work from anywhere in the world, moving to services which are hosted on servers and in data centers that are not under our control often feels like we’ve lost that comfort… that assurance that we’re controlling the security of our critical IT services, or it feels that we’ve given the management of our security over to someone else (that we can’t see, that we can’t talk to and that we don’t know).

When, in actuality, often services like Office 365 are more secure than we could have ever hoped to deploy in our own environments… often we have more control over how our services are secured than we’ve ever had. We often just aren’t aware yet of the security benefits that come out of box with Office 365, and we’re not aware of the security capabilities that are available for us to use.

Office 365 Secure Score is a security analytics tool from Microsoft that comes with your Office 365 subscription. It’s built to help us understand and navigate all of the security options that are available to us in Office 365.

It’s a relatively new feature from Microsoft, released early this year. Its purpose is really to:

  • Help us understand our current security posture
  • Help us understand which security features we are using and not yet using
  • Help us understand the impact of rolling out new security features to our end users and administrators, and what the security benefits are to us
  • Help us understand how we can improve our security posture, and it even tracks our progress over time

View a video recording of this presentation here.

View the slides used here.

Security; Office 365Antonio Maio, Microsoft MVP
  
12/18/2017 9:32 AM

If only our technology and its users could all operate flawlessly!

Until that day arrives, however, at one point or another you’ll probably need to report a technical issue to a SharePoint support team. Regardless of the specific team you use, they all need the same kind of information from you, the user, in order to troubleshoot what’s gone wrong. The more clearly you can describe your problem, the quicker they can respond and the less you’ll pay for a fix.


Here are some of the basic pieces of information that you should consider including when you describe your incident to a support team:


Your version of SharePoint
O365, SP2016, SP2013, SP2010 all have different features, design structure and functionality. Make sure that any developer picking up a ticket will immediately know what they’re dealing with.


URLs where the issue can be seen
Provide URLs to all sites, web pages or lists and libraries that may be relevant.  


Steps to reproduce the issue
If the issue isn’t immediately viewable at the URL, detail the steps that need to be followed to reveal the issue. 


Browser type and version
If there’s a design or layout issue, knowing the user’s browser specs can be helpful.  


Screenshot of what you’re seeing
What you see is not necessarily what other people will see. If feasible, provide a screenshot of alert messages or layout issues. 


Which users can see the issue… and which can’t
If only some users experience an issue, provide account names so their permissions and account properties can be compared. 


Recent changes to your server farm
If the issue appears shortly after server updates or any other system changes, include the date/time stamp of when they were applied. 


Important deadlines
If you’re trying to meet an important deadline, let support know! If there are times when the site shouldn’t be reset, even for just a few minutes of downtime, let support know that as well. 


Who to reach and where
For emergency issues that may stretch outside of normal working hours, let support know how and when you want to be informed of the outcome.

There’s no one-size-fits-all checklist, but you can’t go wrong using this as a starting point.

Project Management; General; Tips & Tricks; Protiviti On Demand; SharePoint DevelopmentCarmen Carter
  
12/18/2017 9:27 AM

A common practice we are seeing since vulnerabilities were discovered in the TLS 1.0 protocol is network and server admins disabling SSL 3.0, TLS 1.0, and sometimes TLS 1.1.  SharePoint 2016 supports this out-of-the-box, but SharePoint 2010 and 2013 require significant changes to all servers in the farm. Client machines may potentially need to be patched as well, depending on what version of Windows they are running and what browser they use to access your SharePoint sites.  The official documentation describing what needs to be done is located at https://technet.microsoft.com/en-us/library/mt773992(v=office.14).aspx for SharePoint 2010 and at https://technet.microsoft.com/en-us/library/mt773991.aspx for SharePoint 2013.  Note that if you disable TLS 1.0 and TLS 1.1 without following these steps, the web applications in the farm will fail to load.
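
To give a flavor of what those articles involve, one of the documented steps for SharePoint 2013 is enabling strong cryptography for the .NET Framework on every server so SharePoint's own outgoing connections can negotiate TLS 1.2. Below is a hedged sketch of just that step; the full procedure in the links above includes additional updates, SCHANNEL settings and a reboot:

# Enable .NET Framework 4.x strong cryptography in both the 64-bit and 32-bit registry hives
Set-ItemProperty -Path 'HKLM:\SOFTWARE\Microsoft\.NETFramework\v4.0.30319' -Name 'SchUseStrongCrypto' -Value 1 -Type DWord
Set-ItemProperty -Path 'HKLM:\SOFTWARE\Wow6432Node\Microsoft\.NETFramework\v4.0.30319' -Name 'SchUseStrongCrypto' -Value 1 -Type DWord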

Protiviti On DemandJared Rich
  
12/18/2017 9:21 AM


Last year, my team at Protiviti built an Angular 2 single-page-application and we decided to try an exciting new JavaScript library with it: ngRx. The short and sweet of it is that even though Angular 2 renders far faster than Angular 1, it still lags behind React, Facebook’s popular rendering engine. ngRx attempts to bridge this gap by implementing React’s “Flux” design pattern for Angular. If you’re unfamiliar, Flux allows you to only update pieces of the UI when their observed data changes, making your rendering very efficient. Unfortunately, I walked away from the project feeling unsure about ngRx. On one hand, I saw the potential performance gains and it also forces you to code in a clean, structured way that’s less error-prone than usual. On the other hand, I watched as my team and I struggled to get over the difficult learning curve and I couldn’t help but notice that even simple tasks required a lot more code with this pattern. I wasn’t sold on it. Anyway, I moved on to other projects and let ngRx sit in the back of my mind for a couple months. That brings me to this week. I’m in Chicago’s JavaScript Meetup group and this week, one of the talks was going to be about ngRx. This was my chance. For better or worse, I would have my catharsis!

The first talk was about another JavaScript library, RxJS, and this is actually really important for ngRx. The “Rx” in “ngRx” actually refers to RxJS, as ngRx is built around it. Here, I found my first problem. I simply didn’t have a good enough grasp on RxJS. I had used it throughout the ngRx project but I had no idea of how useful it could be. Every time I read an article on RxJS, it read like Doc Brown from Back to the Future rambling about how amazing streams are so a lot of it went over my head. This presentation finally slid things into place for me. RxJS really just provides you with all sorts of helpful utility functions, but forces you to look at things as observables or streams rather than discrete data, which can be a different way of thinking. That said, given how prevalent events and asynchronous actions are in modern JavaScript, and that those are perfect candidates for observables, this is absolutely a library worth your time. I would later find out that my main technical hurdle with ngRx could have been solved just by using an RxJS function…. D’oh!


So then came the ngRx talk. The moment I had been waiting for. And the speaker was ready for me too!



I chuckled out loud as I had had these exact debates with my teammates before. But he got right to it, not only being a fan of the library but also clearly having a mastery of it. Though he didn’t offer any earth-shattering revelations, he had a number of great suggestions for using the framework:

  • You don't have to use ngRx and Flux for every component. If a component doesn't have a lot of crazy UI rendering and has a simplistic state model, then you don't need to use it if you don't want to. This is also important, because this means you can add ngRx to your app after-the-fact and not have to worry about redoing the whole site
  • Implement a rigid folder structure, separating out your reducers, actions, effects, and selectors. This is critical as you scale
  • If you use the Redux dev tools, you can time travel your model for debugging, e.g. "what was the previous state?" I never used this, unfortunately, but I wish I had
  • He mentioned a concept called “smart & dumb components” that I really liked. This idea was popularized with React but fits perfectly with ngRx too. The idea is to separate your components into two types:
    • Smart Components: these know how things work, hold the reference to the state, call flux actions, etc. A "container component"
    • Dumb Components: these deal with how things look, have little if any state, minimal dependencies, etc. A “presentational component,” e.g. a footer of your page
  • Don’t use 2 stores. My team and I had talked about potentially doing this but the speaker said this is more trouble than it’s worth
  • Lastly, he demoed ngRx Effects. Here, he subscribed to an action (more RxJS) and was able to do some cool GUI stuff in response to the action

Though I still can’t say I’m a diehard fan of ngRx, this talk assuaged a lot of my concerns with the library. And to be fair, the speaker acknowledged the learning curve but countered it by saying he now feels he develops applications faster in the long run with ngRx: easier to alter plus fewer bugs and regressions. Anyway, I’m glad I was able to get some fresh perspective on ngRx and I hope these tips help you if you choose to give ngRx a try yourself. Thanks for reading.


Speaker’s Article that the presentation was based on: http://brianhudi.org/2017/11/15/adding-ngrx-to-existing-projects-is-easy/
ngRx: https://gist.github.com/btroncone/a6e4347326749f938510
RxJS: http://reactivex.io/rxjs/

Mike Bestvina
  
11/9/2017 11:17 AM


One of the lesser known tools in the Microsoft Office 365 toolkit is Microsoft Sway. My only impression of Sway was that it was PowerPoint on steroids, so I decided to learn a little more about it at Microsoft Ignite in Orlando. I learned that although similar, Sway and PowerPoint are unique and each has its own use cases.
Sway makes it quick and easy to create and share polished, interactive reports, presentations, personal stories, and more. The tool has been around for two years and is cloud based, so users do need to be connected to the internet to access it. Sway data is stored in Azure and although there is no way to download it, you can export it to PDF, but then you lose the whole interactive webpage feel of the experience.

Users can either create a Sway from scratch or leverage existing content and all presentations and reports are responsive. Using the built-in design engine, users can easily add content, design and format. Sway even suggests searches via Bing to help you find relevant images, videos, tweets, and other content that you can drag and drop right into your creation. No need to juggle apps and web pages to find what you want.


But, for those who like to control every element of a presentation, Sway might not be for you. It certainly has limitations on design. You don't have the ability to control each object in your Sway presentation - other than hitting remix and allowing the system to generate a new option. So if you are a pixel perfect designer, this could be frustrating for you.
 
To create a Sway - you start by providing the elements:


Then, let Sway do the work to generate possible options using the Remix button until you find the one you like:
 




So to pull together a quick, engaging presentation that will be shared with many - Sway is a great option. As it is cloud based, there is no version management - anyone accessing your presentation will see the most recent version.
 
Sway Presentations and Reports can be shared with just a few people or made public. There are some great analytics on who has viewed your Sway, how engaged they were - how far through the Sway they read. This could be useful for a company policy document that everyone is required to read.
 
Sway can help make a dry topic more engaging. You can embed videos, tweets, other sways, Power BI dashboards and even create forms. You can add slide shows, videos, stack images and add movement and interaction to make the presentation more engaging.
 


You can also set color schemes to flow through the Sway. It can be vertical or horizontal.
 


 
As an Admin, you can control who in your organization can access Sway, block content and allow or prevent external sharing.
 
So do a search for public Sways and be inspired - here is the one from the Sway team at Microsoft Ignite https://sway.com/4dLxKa4BR2nBLWCF
 
Here are some links to more information from Microsoft on Sway:
Getting Started with Sway: https://support.office.com/en-us/article/Getting-Started-with-Sway-2076c468-63f4-4a89-ae5f-424796714a8a
FAQs: https://support.office.com/en-us/article/Frequently-Asked-Questions-about-Sway-d550100d-74c1-48b5-94dd-de8cc9b694e9

General; Office 365; Tips & TricksClaire Prosser
  
11/9/2017 10:49 AM
 

Client Challenge:

An International Finance Company turned to Protiviti to automate their document approval process and create a centralized location for all of their policy and procedure documents for their compliance department.

Previously, their approval process was done manually and faced a number of issues. All Policies and Procedures were stored in different document libraries on a variety of sites, so users of these documents were not able to easily find them. There was no version control process in place when multiple individuals were editing documents offline and emailing reviewers their updated versions. Lastly, there was an unorganized, sometimes non-existent, historical trail of approvals, which was becoming an issue during audits.

Our client asked for a solution that would comply with a list of regulations. They needed these documents to only be accessible to particular groups based on their department. The metadata about these documents could only be updated by their Compliance Department and was expected to change on a regular basis. Each approval needed to occur on a scheduled review date or on an ad hoc basis. The approvers would vary based on the document. During a review process, the draft versions would only be accessible to reviewers and approvers, but they needed to ensure the draft submitted for approval could not be edited during the process. If approved, general users needed to easily find these documents. Once completed, the approval and versions of the document needed to be recorded and easily accessible.


Solution Provided:

After considering all of the client’s needs, we were able to create a solution using OOTB features of SharePoint 2013 and Nintex Workflows/Forms.

To manage security and searchability of these documents, we created two document libraries for Policies and Procedures. These libraries included department folders where documents were uploaded and inherited the permissions of the folder. Each document was tagged with a managed metadata term for its department so users could easily search for their documents within the libraries. All additional metadata, including approvers for the document, was updated by the Compliance Department in a Master list. To sync the metadata with a document, we used a list lookup for the title field and workflows to bring in all data for that document title.

The approval process involved enabling content approval and versioning on the library and using Nintex workflows. When content approval was enabled, a user was able to “publish” a document to submit it for approval. When a user published the document as a major version, an ad hoc approval workflow began involving multiple approvals. We used versioning to check out the document to a service account. This check-out ensured no users were able to edit the document during the review. Once approved, the document was published as a major version in the library and was visible to all users. A similar process was used during a scheduled review but was automatically started based off of the “Next Review Date” column in each library. A site workflow ran every day to compare each document’s “Next Review Date” with today’s date.


Results:

Our team successfully created a solution which complied with all of the client’s regulations and helped to solve their previous pain points. Since we used OOTB SharePoint and Nintex Workflows, each approval had a clear historical trail of all versions and approval processes. The client reported positive feedback during their annual audit with the new system. The client later requested to add additional features such as an Effectiveness Review Form. Moving forward, this process can easily be replicated for other clients that face similar issues with their document approvals.

General; Governance; Protiviti On Demand; User AdoptionKelsey Butler
  
11/1/2017 2:57 PM

 
The General Data Protection Regulation, or GDPR, is a new data protection and privacy standard that was adopted in recent years by the European Union. The GDPR was designed to strengthen and unify data protection for residents of the European Union regardless of where in the world that data is used, stored, or processed.  As a result, it affects all organizations worldwide that store or process personal data for European residents. 
 
The enforcement deadline is May 25, 2018, and for organizations that have EU employees or customers, it will impact virtually every department and C-suite executive, including the General Counsel’s Office, Chief Compliance Officer, Chief Technology Officer, Chief Financial Officer, Chief Information Security Officer and Human Resources. 
 
Join Protiviti and Microsoft in our complimentary GDPR roundtable series to speak with our experts and learn about how the Microsoft Cloud and Protiviti can help you:
•    Understand the scope of GDPR regulations and how they may apply to your business
•    Learn how to use Microsoft Cloud capabilities to help you become GDPR compliant
•    See how investments in the Microsoft Cloud are helping it to become GDPR compliant
 
Robert Half Legal and Protiviti help global organizations define and manage compliance programs for the EU General Data Protection Regulation (GDPR).  With our combined services, customers benefit from Protiviti’s data privacy and security expertise and access to Robert Half Legal assistance in translating legal requirements, standards and guidelines into risk-adjusted operational frameworks, policies and procedures to meet compliance requirements.  Protiviti’s deep experience with Microsoft Cloud solutions positions us as an ideal partner to help Microsoft customers move to Office 365 and Azure, while implementing compliance with GDPR requirements.
 
The goals of this roundtable series are to educate you on GDPR requirements and how your organization may be impacted, raise awareness of Microsoft’s commitment to meet GDPR regulations with its Microsoft 365 offering by the enforcement deadline, and demonstrate how you may use Microsoft’s innovations in security and compliance to meet your GDPR requirements.
 

We have two events happening this month in the following cities:

 

Vancouver BC, Canada

Monday November 20th – 12:00 to 2:00pm
 
Location:
Microsoft Technology Center
1111 West Georgia Street, Suite 1100, Vancouver
 
Speakers:
Antonio Maio, Protiviti
Antonio is a Microsoft MVP, with a specialization in Office Server and Services. He is active in the Microsoft community, and a frequent speaker at Microsoft and other industry conferences.  Antonio is a national technical leader within Protiviti, focused on helping customers with architecture, development and security related projects using Microsoft technologies.
 
Chris McNulty, Microsoft
Chris is a Senior Product Manager for SharePoint & Office 365 and is a recognized leader in Microsoft technologies including SharePoint, SQL Server, Project Server, Active Directory, .NET, and Exchange.

If you're interested in attending our complimentary Roundtable in Vancouver, click here.
 


Toronto ON Canada

Wednesday November 22nd – 12:00 to 2:00pm
 
Location:
Microsoft Office
222 Bay Street, Suite 1201, Toronto
 
Speakers:
Antonio Maio, Protiviti
Antonio is a Microsoft MVP, with a specialization in Office Server and Services. He is active in the Microsoft community, and a frequent speaker at Microsoft and other industry conferences.  Antonio is a national technical leader within Protiviti, focused on helping customers with architecture, development and security related projects using Microsoft technologies.
 
Steve Vandenberg, Microsoft
Steve is an Architect within Microsoft’s Cybersecurity Practice and is skilled at developing and articulating security vision and roadmap for technical and nontechnical stakeholders.

If you're interested in attending our complimentary Roundtable in Toronto, click here.
 

GDPR Roundtable Agenda:
•    12:00: Event Check In
•    12:00-12:30: Complimentary Lunch & Networking
•    12:30-1:45: Presentation
•    1:45-2:00: Q&A Discussion
 
We look forward to talking with you more about how you will meet your GDPR obligations!
 

General; Events; Security; GovernanceAntonio Maio, MVP
  
10/19/2017 9:00 AM


The only option SharePoint 2016 allows for User Profile Synchronization is Active Directory Import.  


Active Directory Import is a fast and reliable option and is easy to configure. However, there are certain limitations to using this feature. These are:
1. Import is unidirectional (changes go from Active Directory to SharePoint Server Profile).
2. Import from a single Active Directory forest only.
3. Does not import user photos automatically.
4. Supports Active Directory LDAP only.
5. Multi-forest scenarios are not supported.

Previous versions of SharePoint Server (SharePoint 2013 and older) supported all of the scenarios listed above by default. This was made possible by ForeFront Identity Manager (FIM), which was built into the SharePoint Server product. FIM powered the User Profile Synchronization in these versions. Starting with SharePoint Server 2016, FIM is no longer included as part of the SharePoint software. The recommendation from Microsoft is to instead use Microsoft Identity Manager 2016.

If you foresee needing any of the features listed above, then an implementation of Microsoft Identity Manager (MIM) is required.

Microsoft Identity Manager (MIM) is a separate server technology that works independently of SharePoint Server. MIM is the successor to Microsoft’s Forefront Identity Manager.

MIM supports the following:
1. Flexibility for customized import.
2. Can be customized for bidirectional flow.
3. Imports user profile photos automatically.
4. Supports non-Active Directory LDAP sources.
5. Multi-forest scenarios are supported.

Implementing MIM requires the installation and configuration of additional servers. Additional details on Microsoft Identity Manager in relation to SharePoint Server 2016 can be found here: https://technet.microsoft.com/en-us/library/mt627723(v=office.16).aspx?#BKMK_WhatIsMIM1

SharePoint 2016; Admin; GeneralAjay Ram
  
9/27/2017 11:17 AM

Microsoft is constantly evolving, and now predicts that by 2018, a whole 30% of its revenue will be attributed to cloud products. And leading those cloud products is the Office 365 platform.

As cloud technology has improved, Microsoft has been able to move away from solely on-premises products like SharePoint and the Office Suite, re-building them for the cloud on a unified platform. In doing so, it has offered companies of all shapes and sizes the opportunity to migrate to the cloud and witness its inherent benefits.
 
In this post, we’ll explore five key reasons you should consider migrating to Office 365.
 

1. Flexibility

 
Available across all your devices
Office 365 means your colleagues no longer need to be chained to their desks to access company data and be productive. It’s possible to work on smartphones, tablets and laptops, all backed up by comprehensive security measures.
 
Improved productivity
There are a number of ways in which Office 365 can encourage a more productive workforce. Cross-platform support means users can access tools from whichever device they’re using—PC, Mac, iOS and Android devices—while anytime/anywhere access to Word, PowerPoint and Excel files means users can get work done no matter their location. Advanced document sharing and collaboration features add to this, allowing users to work together on files while staying on the same page.
 
Empowering users
Office 365 is available in the cloud, yet also has the flexibility of integrating with your company’s on-premises solutions. Take SharePoint 2016, for example. Hybrid functionality means users can conduct a unified search that sources information from your on-premises environment and Office 365 in the cloud.
 

2. Lower costs and scalability

 
In capable hands
Moving your business email to the cloud places maintenance into the hands of Microsoft, meaning you have thousands of the most highly qualified engineers and developers monitoring and protecting some of the most secure data centers on the planet. This means that you spend more time working on your core business, rather than actively fighting fires as they arise.
 
Only pay for what you need
Office 365 works around the needs of your business, and that’s reflected in its pricing model. When your organization grows, you can simply pay for the additional services and data storage you need.
 

3. Collaboration

 
Collaborate with colleagues
Whether in real time or separately, colleagues can work together on files to edit, share and refine documents to the highest standard. Document co-authoring is possible in Word, PowerPoint and Excel Online, where users can see where exactly other users are working on a document and view the changes they’re making.
 
Keep documents synced with OneDrive for Business
OneDrive for Business is a personal document store within SharePoint, and comes with a host of user controls and reports. It is available in the cloud within Office 365 and for ‘on-premises’ SharePoint installations—but users can benefit from rolling updates with the cloud-hosted platform.
 

4. Automatic updates

 
Keeping updates rolling
In the past, getting hold of the newest versions of Microsoft products involved the wholesale purchase of new licenses. But Office 365 subscribers have constant access to the latest versions of Office products, regardless of deployment type.
 
Advantage on your investment
Software Assurance is a sort of insurance policy that lets organizations receive the latest versions of Microsoft products, among other features, without having to buy all new licenses. This can reduce software and service costs with rights to new software releases and cost-efficient upgrades. 
 

5. Data security

 
Got your back
There are a number of measures Microsoft takes to ensure its customers’ data is as secure as possible. For starters, Microsoft does not disclose the locations of its facilities to the public. Data is kept secure using BitLocker encryption, HTTPS and Information Rights Management (IRM) on document libraries. Office 365 also enforces strong user passwords to make sure a brute-force cracking attempt doesn’t lead to a security breach.
 
More control
In terms of managing company data internally, Office 365 administrators can control what is seen (and by whom) in your organization. Furthermore, Microsoft employs multiple layers of redundancy and backups of information at the datacenter level so that data can be restored if necessary. So, you’re in safe hands.
 

Continued growth

 
Benefits like these account for some of the reasons why Office 365 has seen massive growth in the past few years. Thousands of small and medium-sized enterprises have made the move to Office 365 in the cloud, but migration is a bigger task for larger organizations, whose mass of documents and working resources are seemingly cemented in their on-premises environments. For those companies, or for those unsure of the details of a migration, assistance from migration professionals can save time and money.
 
Guest Blog Post by Sharegate

General; Office 365; Migration | Guest Blog Post by Sharegate
  
9/27/2017 11:10 AM


Microsoft is constantly expanding its Office 365 offerings. When discussing SharePoint specifically, we’ve all heard of the major changes to the UI that are coming. This is all very good news: in addition to the visuals, the back end is getting a major overhaul as well. I’d like to shed some light on some of the JavaScript additions and changes that have made their way into the SharePoint Classic / Publishing pages.

_spPageContextInfo:

This handy variable has expanded quite a bit in Office 365, especially compared to 2013.  There are many things asked of us on the client development side, and SharePoint always makes it difficult to fulfill every request.  The _spPageContextInfo variable helps immensely with boilerplate data fetching, and the Office 365 changes are significant.  Check out the zoomed-out diff screenshot, and see the short usage sketch after the list below.



 
My favorites compared to 2013 include:

  • User’s Time Zone Setting (_spPageContextInfo.userTimeZoneData)
  • Web’s Time Zone Setting (_spPageContextInfo.webTimeZoneData): if the user doesn’t have one set, fall back to this
  • User’s Login Name (_spPageContextInfo.userLoginName)
  • User’s Display Name (_spPageContextInfo.userDisplayName)
  • List ID where current item is from (_spPageContextInfo.listId)
  • Group Color (_spPageContextInfo.groupColor)
  • Site Color (_spPageContextInfo.siteColor)
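
For illustration, here is a minimal sketch, assuming a Classic/Publishing page where SharePoint has already populated the global variable, that reads a few of these properties and falls back from the user’s time zone to the web’s when none is set:

    // Minimal sketch: reading a few _spPageContextInfo properties on a
    // Classic/Publishing page (assumes SharePoint has populated the global).
    (function () {
        var ctx = window._spPageContextInfo || {};

        // Prefer the user's time zone; fall back to the web's if the user has none set.
        var timeZone = ctx.userTimeZoneData || ctx.webTimeZoneData;

        // Basic identity and context without an extra REST call.
        var user = { login: ctx.userLoginName, display: ctx.userDisplayName };
        var listId = ctx.listId;                      // list the current item belongs to
        var colors = { group: ctx.groupColor, site: ctx.siteColor };

        console.log('User:', user, 'Time zone:', timeZone, 'List:', listId, 'Colors:', colors);
    })();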

Office 365; Tips & Tricks | Mike Schaefer
  
9/5/2017 1:55 PM


You have probably heard about the new General Data Protection Regulation (GDPR) that will be coming into effect in May 2018. This new regulation was initiated by the European Commission in 2012 and finally agreed upon by the European Parliament and Council in December 2015. The GDPR is set to replace the current Data Protection Directive 95/46/EC.

Most companies have already adopted privacy processes and procedures consistent with the Directive, but the GDPR contains a number of new protections for EU data subjects and threatens significant fines and penalties for non-compliant data controllers and processors. There are some core areas that are important to understand about this new regulation, as well as how it fits into your existing policies and the platforms you may be using.

Some of the core areas covered by the new regulation are:

  • Cybersecurity and data breach notifications
  • Data protection officer requirement
  • Data consent
  • Cross-border data transfers
  • Data Profiling
  • Data portability
  • Vendor management
  • Pseudonymizing of personal data
  • Codes of conduct and certifications
  • Violations


Looking at these topics, you may wonder how this fits into Office 365 and what you, as an organization, can do about it.

In reality, if you are already using Office 365 and Azure Services then you are part way there.



Office 365 combined with Azure services provides much of what you need to meet GDPR requirements. Office 365 offers core features such as Data Loss Prevention, eDiscovery, Customer Lockbox and Advanced Data Governance for controlling the data that you store within all components. To meet some of the core protection requirements of the GDPR, Office 365 provides Advanced Threat Protection, Advanced Security Management, Audit Logs and overall Threat Intelligence.

To learn more about these areas of Office 365, head over to these posts:

Security and Compliance
http://sharepointpromag.com/sharepoint/security-and-compliance-office-365-administration
http://sharepointpromag.com/sharepoint/security-and-compliance-office-365-security-and-permissions

Data Loss Prevention
http://sharepointpromag.com/sharepoint-administration/security-and-compliance-office-365-data-loss-prevention
http://sharepointpromag.com/sharepoint/data-loss-prevention-sharepoint-premises-and-online

eDiscovery
https://support.office.com/en-us/article/Set-up-an-eDiscovery-Center-in-SharePoint-Online-A18F8975-AA7F-43B4-A7D6-001D14744D8E
https://support.office.com/en-us/article/eDiscovery-in-Office-365-143b3ab8-8cb0-4036-a5fc-6536d837bfce
https://support.office.com/en-us/article/Plan-and-manage-cases-in-the-eDiscovery-Center-d955aeb8-0d48-4291-a8e2-f3b84f17943f
https://technet.microsoft.com/en-us/library/fp161516.aspx

Customer Lockbox
https://blogs.office.com/2015/04/21/announcing-customer-lockbox-for-office-365/
https://support.office.com/en-us/article/Office-365-Customer-Lockbox-Requests-36f9cdd1-e64c-421b-a7e4-4a54d16440a2

Advanced Data Governance
https://support.office.com/en-us/article/Data-governance-in-the-Office-365-Security-Compliance-Center-5fe09846-41b6-4168-9c48-2eb491b69dc2

Advanced Threat Protection
https://www.helloitsliam.com/2015/08/31/microsoft-advanced-threat-analytics-setup-part-1/
https://www.helloitsliam.com/2015/08/31/microsoft-advanced-threat-analytics-setup-part-2/

Advanced Security Management
https://support.office.com/en-us/article/Overview-of-Advanced-Security-Management-in-Office-365-81f0ee9a-9645-45ab-ba56-de9cbccab475
https://blogs.office.com/2016/06/01/gain-enhanced-visibility-and-control-with-office-365-advanced-security-management/

Audit Logs
http://sharepointpromag.com/sharepoint/security-and-compliance-office-365-reporting

Threat Intelligence
https://blogs.office.com/2017/04/04/announcing-the-release-of-threat-intelligence-and-advanced-data-governance-plus-significant-updates-to-advanced-threat-protection/
https://support.office.com/en-us/article/Office-365-Threat-Intelligence-overview-32405da5-bee1-4a4b-82e5-8399df94c512
https://blogs.office.com/2017/02/10/new-office-365-capabilities-helps-you-proactively-manage-security-and-compliance-risk/

Advanced Information Protection
https://www.microsoft.com/en-us/cloud-platform/azure-information-protection

In reality, we have yet to see how this will impact businesses, or you as an individual. What we do know is that every organization doing business in Europe is affected. The GDPR will fundamentally change the way you do business by enforcing the requirements outlined below. The three biggest to affect companies here in the US will be "Consent", "Right to be forgotten" and "Privacy by Design". This will be an interesting year as we head into the new GDPR world; I am sure there will be many lessons to learn over the next few months and into next year.

Each of the sections within the GDPR covers a vast array of areas; the following paragraphs outline at a high level what each category covers.


This information has been gleaned from the following sources, some copied as is, some just used for reference.

Sources and Citations
http://www.eugdpr.org/
https://www.privacy-regulation.eu/en/index.htm
https://iapp.org/resources/topics/eu-gdpr/
https://en.wikipedia.org/wiki/General_Data_Protection_Regulation
https://ico.org.uk/for-organisations/data-protection-reform/overview-of-the-gdpr/
https://www.protiviti.com/sites/default/files/united_states/protiviti-flash-report-final-gdpr-053116.pdf
https://www.protiviti.com/US-en/node/73266
https://www.microsoft.com/en-us/TrustCenter/Privacy/gdpr/default.aspx
https://blogs.microsoft.com/on-the-issues/2017/02/15/get-gdpr-compliant-with-the-microsoft-cloud/
https://blogs.microsoft.com/blog/2017/05/24/accelerate-gdpr-compliance-microsoft-cloud/

 
Cybersecurity and data breach notifications
Data security is of utmost importance for all companies, or at least it should be, based on the events of the past few years. The GDPR imposes much stricter rules on data controllers and processors with regard to data security. The GDPR separates the responsibilities of data controllers and data processors, ensuring, and in reality forcing, each to provide assurances that sufficient technical and organizational measures are taken with data security. Article 32 of the GDPR outlines the list of appropriate measures.

https://www.privacy-regulation.eu/en/32.htm

As well as ensuring that these are met, in the event of a breach the GDPR rules differ slightly from current rules. Under the GDPR, a “personal data breach” is “a breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorized disclosure of, or access to, personal data transmitted, stored or otherwise processed.” This broad definition differs from that of most U.S. state data breach laws, for example, which typically are triggered only upon exposure of information that can lead to fraud or identity theft, such as financial account information. Article 33 of the GDPR outlines the requirements.

https://www.privacy-regulation.eu/en/33.htm

One of the goals of the GDPR is to harmonize data breach notification approaches. Within the US, most states have different notification laws, which of course has caused many issues. The GDPR, at least within EU member states, will provide a predictable and efficient process for breach notifications.

 

Data protection officer requirement
Under Article 37 of the GDPR, organizations must designate a data protection officer; this requirement applies to all public authorities, among others. Article 37 does not outline the credentials a data protection officer must carry, but it does require that they have “expert knowledge of data protection law and practices“.

https://www.privacy-regulation.eu/en/35.htm
https://www.privacy-regulation.eu/en/37.htm

As listed within Article 39, the core data protection officer tasks are:

  • Informing and advising the controller or processor and its employees of their obligations to comply with the GDPR and other data protection laws.
  • Monitoring compliance with the GDPR and other data protection laws, including managing internal data protection activities, training data processing staff, and conducting internal audits.
  • Advising with regard to data protection impact assessments when required under Article 35.
  • Working and cooperating with the controller’s or processor’s designated supervisory authority and serving as the contact point for the supervisory authority on issues relating to the processing of personal data.
  • Being available for inquiries from data subjects on issues relating to data protection practices, withdrawal of consent, the right to be forgotten, and related rights.

Under the Regulation, data protection officers have many rights in addition to their responsibilities. They may insist upon company resources to fulfill their job functions and for their own ongoing training. They must have access to the company’s data processing personnel and operations, significant independence in the performance of their roles, and a direct reporting line “to the highest management level” of the company. Data protection officers are expressly granted significant independence in their job functions and may perform other tasks and duties provided they do not create conflicts of interest.

Data consent
When working with data, one area that is of utmost importance is that of “consent“. Consent is one of the lawful bases for processing and transferring personal data. Under the existing Directive, controllers were allowed to rely on implicit and “opt-out” consent in some circumstances. The GDPR changes what is currently used in three specific ways. First, the GDPR gives data subjects the right to withdraw consent at any time and “it shall be as easy to withdraw consent as to give it.” Controllers must inform data subjects of the right to withdraw before consent is given. Once consent is withdrawn, data subjects have the right to have their personal data erased and no longer used for processing.

Second, the GDPR adds a presumption that consent is not freely given if there is “a clear imbalance between the data subject and the controller, in particular where the controller is a public authority.” Importantly, a controller may not make a service conditional upon consent, unless the processing is necessary for the service.

Third, the GDPR adds that consent must be specific to each data processing operation. To meet the specificity requirement under Article 7, a request for consent to data processing must be “clearly distinguishable” from any other matters in a written document, and it must be provided “in an intelligible and easily accessible form, using clear and plain language.” However, the law exempts controllers from obtaining consent for subsequent processing operations if the operations are “compatible.” Recital 50 states that compatibility is determined by looking at factors including the link between the processing purposes, the reasonable expectations of the data subject, the nature and consequences of further processing, and the existence of appropriate safeguards for the data.

Extra protections have been added to safeguard the personal data of children by limiting their ability to consent to data processing without parental authorization. The default age of consent has been raised from 13 years old to 16 years old; member states can lower the age, though not below 13 years, and must then require parental or guardian consent below the applicable age.

https://www.privacy-regulation.eu/en/7.htm

 

Cross-border data transfers
Chapter V (Articles 44 through 49) of the GDPR governs cross-border transfers of personal data. Article 45 states the conditions for transfers with an adequacy decision; Article 46 sets forth the conditions for transfers by way of appropriate safeguards in the absence of an adequacy decision; Article 47 sets the conditions for transfers by way of binding corporate rules; Article 48 addresses situations in which a foreign tribunal or administrative body has ordered transfer not otherwise permitted by the GDPR; and Article 49 states the conditions for derogations for specific situations in the absence of an adequacy decision or appropriate safeguards.

The GDPR provides mechanisms for cross-border data transfers in the absence of an adequacy designation if the controller or processor utilizes certain safeguards. Under Article 46, appropriate safeguards include:

  • Legally binding and enforceable instrument between public authorities or bodies.
  • Binding corporate rules in accordance with article 47.
  • Standard data protection contractual clauses adopted by the Commission in accordance with the examination procedure referred to in Article 93(2).
  • Standard data protection contractual clauses adopted by a supervisory authority and approved by the Commission pursuant to the examination procedure referred to in Article 93(2).
  • An approved code of conduct pursuant to Article 40 together with binding and enforceable commitments of the controller or processor in the third country to apply the appropriate safeguards, including as regards data subjects’ rights.
  • An approved certification mechanism pursuant to Article 42 together with binding and enforceable commitments of the controller or processor in the third country to apply the appropriate safeguards, including as regards data subjects’ rights.


Controllers must provide certain information to data subjects when their information is obtained. This explicitly includes (a) that the controller intends to transfer personal data to a third country or international organization; and (b) that such transfer is pursuant to an adequacy decision by the Commission; or (c) reference to the appropriate or suitable safeguards and the means for the data subject to obtain them. Such information must be provided in a concise, transparent, intelligible and easily accessible form, using clear and plain language, and as otherwise required by Article 12.

Perhaps the biggest change here from the original Directive is that failure to comply with the GDPR international data transfer provisions may result in hefty fines. Violations of the data transfer provisions in Articles 44-49 are subject to the steeper of the two administrative fine provisions in the GDPR. Such violations may result in “administrative fines up to 20,000,000 EUR, or in the case of an undertaking, up to 4 percent of the total worldwide annual turnover of the preceding financial year, whichever is higher.” The factors considered for imposing a fine include “the nature, gravity and duration of the infringement, the intentional character of the infringement, actions taken to mitigate the damage suffered, degree of responsibility or any relevant previous infringements, the manner in which the infringement became known to the supervisory authority, compliance with measures ordered against the controller or processor, adherence to a code of conduct and any other aggravating or mitigating factor.”

https://www.privacy-regulation.eu/en/44.htm
https://www.privacy-regulation.eu/en/45.htm
https://www.privacy-regulation.eu/en/46.htm
https://www.privacy-regulation.eu/en/47.htm
https://www.privacy-regulation.eu/en/48.htm
https://www.privacy-regulation.eu/en/49.htm

 

Data Profiling
Since the current Directive was implemented nearly 20 years ago, technologies have proliferated that allow data controllers to gather personal data and analyze it for a variety of purposes, including drawing conclusions about data subjects and potentially taking action in response to those conclusions, such as targeted marketing, price differentiation, and the like. Although the concepts of “profiling” and “targeted marketing” are addressed by the Directive, the precise terms do not appear in it. Under Article 4(4), data processing may be characterized as “profiling” when it involves (a) automated processing of personal data; and (b) using that personal data to evaluate certain personal aspects relating to a natural person. Specific examples include analyzing or predicting “aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location or movements.” This definition implicitly excludes data processing that is not “automated.”

The GDPR’s view that “profiling” requires some sort of outcome or action resulting from the data processing is underscored by the data subject’s rights to be informed of the “consequences” of profiling decisions, as discussed in Recitals 60 and 63. Articles 13 and 15, which address the information to be provided to a data subject upon personal data collection and upon the data subject’s request, both require disclosure of “the existence of automated decision making including profiling” along with “the significance and the envisaged consequences of such processing for the data subject.”

Data subjects are given the right to object to processing for direct marketing as well as to “profiling to the extent it is related to direct marketing,” further underscoring that profiling is not direct marketing per se but instead is something more. Recital 91 describes the obligation to conduct a data impact assessment and characterizes the “profiling of data” as follows: “A data protection impact assessment should also be made where personal data are processed for taking decisions regarding specific natural persons following any systematic and extensive evaluation of personal aspects relating to natural persons based on profiling those data.”

Even when profiling is otherwise lawful, a data subject has the right to object at any time. Pursuant to Article 19, upon the data subject’s objection to profiling that is otherwise authorized under Article 6, the processing must cease unless the controller demonstrates “compelling legitimate grounds for the processing which override the interests, rights and freedoms of the data subject.”

When processing is for direct marketing purposes, including profiling, the data subject similarly has a right to object but in this case processing must cease and the controller is not authorized to continue under any circumstances.

https://www.privacy-regulation.eu/en/4.htm
https://www.privacy-regulation.eu/en/19.htm

 

Data portability
The GDPR introduces two new rights for data subjects. First, the regulation codifies a right to be forgotten. This right allows individuals to request the deletion of personal data and, where the controller has publicized the data, to require other controllers to also comply with the request. Second, the right to data portability requires controllers to provide personal data to the data subject in a commonly used format and to transfer that data to another controller if the data subject so requests.

The GDPR also augments the existing rights of data subjects to receive notice about processing activities, to gain access to the information that is being processed, and to have the controller rectify inaccuracies. Moreover, the data subject’s right to object to processing is broader than under the Directive, allowing objection to processing at any time unless the controller has compelling legitimate grounds.

One of the new rules is directly related to the term “Big Data“: the creation of a new right to data portability that aims to increase user choice among online services. Where controllers process personal data through “automated means,” Article 20 grants data subjects the right to receive the personal data concerning them. Controllers must provide the data in a commonly used and “machine-readable” format, and data subjects have the right to transmit that data to any other controller. Where feasible, the controller may even be required to transmit the data directly to a competitor.
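
To make “commonly used and machine-readable” concrete, here is a minimal, hypothetical sketch of a portability export; the record structure and field names are purely illustrative, not anything prescribed by the Regulation:

    // Hypothetical sketch of an Article 20 portability export. The field names and
    // structure are illustrative only; the GDPR simply requires a commonly used,
    // machine-readable format such as JSON or CSV.
    var subjectRecord = {
        subjectId: 'example-subject-001',                      // illustrative identifier
        profile: { name: 'A. Example', email: 'a.example@example.com' },
        preferences: { newsletter: false },
        exportedAt: new Date().toISOString()
    };

    // Serialize to JSON so the data subject (or a receiving controller) can reuse it.
    var portableExport = JSON.stringify(subjectRecord, null, 2);
    console.log(portableExport);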

The GDPR increases the number of disclosures a controller must make before collecting personal data. In addition to the identity of the controller, the purposes for processing, and any recipients of personal data, Article 13 requires controllers to disclose how long the data will be stored. Controllers also must inform data subjects of the right to withdraw consent at any time, the right to request access, rectification or restriction of processing, and the right to lodge a complaint with a supervisory authority.

https://www.privacy-regulation.eu/en/17.htm
https://www.privacy-regulation.eu/en/20.htm

 

Vendor management
The GDPR defines a controller as “the natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data.” The controller is the entity that makes decisions about processing activities, regardless of whether it actually carries out any processing operations.

Article 24 makes controllers responsible for ensuring that any processing activities are performed in compliance with the Regulation. Controllers must “implement appropriate technical and organizational measures” not only to ensure compliance, but also to be able to demonstrate the measures that they have in place. Controllers are liable for the actions of the processors they select and responsible for compliance with the GDPR’s personal data processing principles. Under the GDPR, the term “processor” means a “natural or legal person, public authority, agency or other body which processes personal data on behalf of the controller.”

In the event of a personal data breach, processors are required to notify the controller without “undue delay” if it happens on the processor’s watch. The burden falls on the controller, then, to notify the supervisory authority within 72 hours of becoming aware of the breach. If notification is not made within 72 hours, controllers are required to provide a reasoned justification for the delay. Controllers are also responsible for documenting personal data breaches, including the facts of the breach, its effects, and remedial actions.

https://www.privacy-regulation.eu/en/24.htm

 

Pseudonymizing of personal data
The GDPR introduces a new concept in European data protection law – “pseudonymization” – for a process rendering data neither anonymous nor directly identifying. Pseudonymization is the separation of data from direct identifiers so that linkage to an identity is not possible without additional information that is held separately. Pseudonymization, therefore, may significantly reduce the risks associated with data processing, while also maintaining the data’s utility. For this reason, the GDPR creates incentives for controllers to pseudonymize the data that they collect. Although pseudonymous data is not exempt from the Regulation altogether, the GDPR relaxes several requirements on controllers that use the technique.
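
As a rough illustration of the technique (not a compliance recipe), the sketch below, written as a small Node.js snippet with made-up record fields, replaces direct identifiers with random tokens and keeps the token-to-identity mapping in a separate store:

    // Illustrative pseudonymization: direct identifiers are swapped for random tokens,
    // and the token-to-identity mapping is held separately so the working data set
    // alone cannot identify a person. Field names are made up for the example.
    var crypto = require('crypto');   // Node.js built-in, used here for random tokens

    var identityVault = {};           // stored and secured separately from working data

    function pseudonymize(record) {
        var token = crypto.randomBytes(16).toString('hex');
        identityVault[token] = { name: record.name, email: record.email };
        return {
            subjectToken: token,                          // stands in for direct identifiers
            purchaseHistory: record.purchaseHistory,
            region: record.region
        };
    }

    var working = pseudonymize({
        name: 'A. Example',
        email: 'a.example@example.com',
        purchaseHistory: ['item-1', 'item-2'],
        region: 'EU'
    });
    // 'working' can now be processed; re-identification requires access to the vault.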

  • The Regulation recognizes the ability of pseudonymization to help protect the rights of individuals while also enabling data utility.
  • Pseudonymization may facilitate processing personal data beyond original collection purposes.
  • Pseudonymization is an important safeguard for processing personal data for scientific, historical and statistical purposes.
  • Pseudonymization is a central feature of “data protection by design.“
  • Controllers can use pseudonymization to help meet the GDPR’s data security requirements.
  • Controllers do not need to provide data subjects with access, rectification, erasure or data portability if they can no longer identify a data subject.
  • The GDPR encourages controllers to adopt codes of conduct that promote pseudonymization.


https://www.privacy-regulation.eu/en/6.htm
https://www.privacy-regulation.eu/en/89.htm

 

Codes of conduct and certifications
Articles 40 and 41 are the primary sources of authority for establishing approved codes of conduct to serve as compliance-signaling tools for controllers and processors.

https://www.privacy-regulation.eu/en/40.htm
https://www.privacy-regulation.eu/en/41.htm

The GDPR’s adoption of codes of conduct and certification mechanisms is a welcome development for controllers and processors seeking efficient means for compliance. There are of course upfront administrative burdens of establishing and maintaining compliance with a code of conduct or earning certification status. But these costs are offset by the ease of finding compliant processors, for example, via screening for those adhering to a code or displaying a certification seal. The codes and certifications also may serve as marketing tools, allowing data subjects to choose controllers signaling GDPR compliance via their membership in associations or their certified status. They also will likely play a significant role in facilitating cross-border data transfers.

Violations
The GDPR empowers supervisory authorities to assess fines that are “effective, proportionate and dissuasive.” It sets forth both mitigating and aggravating factors to help DPAs assess the amount of a fine. For example, intentional violations are worse than negligent ones. Mitigating factors include adherence to a code of conduct or certification mechanisms, minimizing the use of sensitive categories of data, and employing appropriate technical and organizational safeguards. In the event of non-compliance, moreover, controllers or processors may limit the amount of a fine by mitigating “the damaging nature, gravity and duration of the violation,” reporting the violation as soon as possible and cooperating with the supervisory authority. Aggravating factors generally include the opposite actions – not seeking to mitigate harm or acting contrary to the mitigating factors.

The GDPR creates two tiers of maximum fines, depending on whether the controller or processor committed any previous violations and the nature of the violation. The higher fine threshold is four percent of an undertaking’s worldwide annual turnover or 20 million euros, whichever is higher. The lower fine threshold is two percent of an undertaking’s worldwide annual turnover or 10 million euros, whichever is higher.
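
As a quick arithmetic sketch of how those caps combine (the turnover figure below is invented purely for illustration):

    // Sketch of the two GDPR fine caps: the maximum is the HIGHER of the fixed amount
    // and the percentage of worldwide annual turnover. Turnover figure is illustrative.
    function maxFineEUR(annualTurnoverEUR, higherTier) {
        var fixedCap = higherTier ? 20000000 : 10000000;                   // 20M / 10M EUR
        var turnoverCap = annualTurnoverEUR * (higherTier ? 0.04 : 0.02);  // 4% / 2%
        return Math.max(fixedCap, turnoverCap);
    }

    // Example: an undertaking with 2 billion EUR worldwide annual turnover.
    console.log(maxFineEUR(2e9, true));   // 80000000 -> the 4% cap exceeds the 20M floor
    console.log(maxFineEUR(2e9, false));  // 40000000 -> the 2% cap exceeds the 10M floor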

The Regulation empowers data subjects to seek judicial relief for damages and file administrative complaints with supervisory authorities. The Regulation’s guidance on imposing fines replaces the patchwork enforcement structure of the Directive, while establishing accountability and consistency mechanisms also lacking under the Directive. The hefty fines and penalties for infringement not only encourage accountability, they may be the single most eye-catching feature of the Regulation, causing multinationals and local companies alike to invest more in compliance. The GDPR’s consistency mechanisms – encouraging supervisory authorities to cooperate and agree on infringement decisions, empowering the Board for dispute resolution, making final decisions binding – will ease burdens on controllers and processors doing business across Member States by offering more efficient enforcement solutions.

https://www.privacy-regulation.eu/en/55.htm
https://www.privacy-regulation.eu/en/77.htm
https://www.privacy-regulation.eu/en/78.htm
https://www.privacy-regulation.eu/en/79.htm
https://www.privacy-regulation.eu/en/80.htm
https://www.privacy-regulation.eu/en/81.htm
https://www.privacy-regulation.eu/en/82.htm
https://www.privacy-regulation.eu/en/83.htm
https://www.privacy-regulation.eu/en/84.htm


To learn more about Office 365 GDPR, download Protiviti's Office 365 GDPR smartsheet.

General; Governance; Office 365; Security | Liam Cleary, SharePoint MVP
  
8/15/2017 10:15 AM


Brief Introduction to Microsoft Teams

In late 2016 Microsoft announced Teams, a new collaborative workspace intended to centralize people, their conversations, and their content and tools within the Office Suite. As with other Microsoft applications, Teams is available in both a desktop and a web version in Office 365. One advantage I’ve noticed through using the tool is that the web application seems to replicate all of the desktop application’s features and functionality, unlike many other Microsoft web applications that fall short on feature offerings.

As a relatively new tool within the Microsoft stack, careful consideration should be given to determining whether Teams should be made available, who should use it, how it should be leveraged, and how it will be managed. If an organization determines that Teams would be a beneficial tool to deploy to users, there are a few things to consider about the use of Teams that I will outline below.

 
The Terminology & Associated Features
With so many new tools and features being released by Microsoft, it can be difficult to keep the terminology on everything straight. Therefore, a few terms I’ll be using a lot in this blog are defined below along with an explanation on their relationship to one another.
•  SharePoint Team Site – as many of you know, SharePoint Team Sites provide users with a collaborative, highly customizable and flexible workspace that allows for content management, process automation, page creation, etc. on top of the SharePoint platform within an organization. A SharePoint Team Site will enable access to Lists, Libraries, a Shared OneNote Notebook, a shared mailbox, an Outlook Calendar, and more.
•  Groups – a new Office 365 feature that allows you to easily set up collections of users. As a new take on distribution lists and shared mailboxes, this feature allows a group manager to select the users to join; the group is in turn provisioned with a Shared Outlook Inbox, a Shared Outlook Calendar, a SharePoint Team Site (site, document library, OneNote, etc.), and more.
•  Teams – a new Microsoft collaborative workspace application, not to be confused with the general sense of the term “teams” used to describe functional groups of individuals. Creation of a team deploys a conversation- and content-centric workspace with various “Channels”, which enable a Team Wiki and threaded Team chat, as well as all of the features offered with a Group or SharePoint Team Site.
 
Refer to the basic diagram below which depicts the relationship between these tools and features.



The Teams Interface

With a better understanding of some of these Microsoft tools/features and their relationship to one another, we can take a closer look at Microsoft Teams. The Teams experience will be new to many, so I wanted to give a very brief, high-level run-through of the application and its interface.

General Interface


 
The Teams application utilizes a simple user interface that rolls up an Activity Feed from all of the user’s Teams, a chat function, a list of the user’s Teams, and Team Files on the left-hand navigation. All of the user’s Teams are listed with various “Channels”, or topic areas, underneath. There is the ability to search across a user’s Teams for chat conversations, people, and files. Each Team/Channel has a basic working space that offers an easily configurable, tabbed navigation where Team members can add links to specific Team files, Microsoft Planner, other applications and connectors, and the Team’s OneNote Notebook.

Now that we’ve outlined some of these new features and covered the basic layout of the Teams application, we can start to dive into the use of OneNote within your Teams.
 

OneNote in Microsoft Teams

I’m sure that almost everyone reading this knows what Microsoft OneNote is and has probably used it for taking notes. The tool provides a great platform for personal note taking; however, by utilizing it in Microsoft Teams and maximizing its features, OneNote can add much more value to you and your functional groups. As depicted in the feature diagram above, once a user creates a Team, a single OneNote Notebook is provisioned through the corresponding SharePoint Team Site.

This is advantageous for a team because groups can manage findings, commentary, and other annotations in a single location that can leverage standardized templates, can be edited by multiple users simultaneously, and can be searched. This shared Notebook can be accessed through the Teams application, the Outlook Web Application, the SharePoint Team Site, and the OneNote desktop application.

*Note that the OneNote desktop application has more features/functionality available than the OneNote instance in the Teams application, but the latter still works well for basic note taking.
 
Once you’ve created an Office 365 Team, you have an associated OneNote Notebook. From the Teams application, select the + symbol and select “OneNote” to make the Notebook accessible to the Team members. The OneNote Notebook will appear as a tab within the Teams application for everyone to access and you can even post the update to the Team thread conversation to update other members.


 
 
Once the Notebook is accessible from the Team’s tabs, you will be able to navigate its pages/sections and enable team note taking within the application, right alongside the threaded conversations, chats, and other shared files. OneNote within the Teams Application does not offer all of the same features as the desktop application, but you can always elect to open within the OneNote desktop application.

In the following section, I’ve outlined some of the key features available within the Teams application, as well as a few of my favorite features that I feel users often don’t utilize to their full advantage.

OneNote Features & Functionality

Within the Teams application, users will notice many familiar features and options that are available within OneNote itself. Below, I’ve outlined some of the key features available from the OneNote ribbons:

•  Home – Core Functions such as Undo / Redo; Cut / Copy / Paste / Format Painter; Basic Text Features; Styling; Tag Creation; and Spell Check
•  Insert – Inserting New Pages / New Sections; Tables; Pictures / Attachments / Links; and Recording Audio
•  Draw – Ability to Draw / Sketch; Control over Drawing Styles and Colors; Highlighting Notes
•  Table Layouts – Ability to Select / Delete / Insert / Shade / Etc. Rows, Columns, and Entire Tables
•  Print / Open in Desktop – Users can also elect to Print their notes or Open them on their Desktop Application for full, native functionality
Outside of the standard features and options that OneNote in Teams offers, I’ve found that there are a number of tools within OneNote’s Desktop Application that users should leverage to maximize the value of their personal notes, their team notes, and the tool as a whole. I’ve highlighted a few of my personal favorites below:

•  OneNote Templates – Everyone has their own style, structure, and process around taking and organizing their notes, which can leave team members scavenging through OneNote Pages trying to find key insights and information in unfamiliar formats. OneNote has the capability to leverage standardized templates to promote consistency around all of those things.
 
In order to take advantage of OneNote Templates, a user can create a structured format within a OneNote Page. From the Page Templates pane under the Insert tab, you’ll notice that Microsoft provides many out-of-the-box templates for users to leverage, as well as the ability to see your personal Templates and to save your current page as a Template. By leveraging Templates, individuals and Teams can standardize note-taking formats, styling, and structure for a more clearly organized experience.


•  Password Protecting OneNote Sections – Shared OneNote Notebooks are a great feature within Microsoft Teams that allow collaborative note taking in a single location. Oftentimes within a team of users, there is some information that shouldn’t be openly shared with everyone involved. Password protecting OneNote Sections allows users to secure certain information that shouldn’t be shared with everyone.
 
In order to password protect a OneNote Section, simply right-click on the Section tab in the desktop application and select “Password Protect this Section…”. From the Password pane, set a password for the Section, and now you can have secured notes within a Team Notebook.
*Sections must be unlocked to search within them.


•  OneNote Tags – OneNote offers the ability to create Tags which allow you to creatively mark key information within your notes. Microsoft offers pre-built Tags that can be leveraged to denote To-Do Lists, Definitions, Contact Information, Note Criticality, etc. From the Home Ribbon, within the Tags section, you can see the extensive list of OneNote Tags. You can also create Microsoft Outlook Tasks within OneNote that will appear in your Outlook Mailbox.
 
OneNote Tags are also fully-customizable to allow you to create, modify, and maintain your own Tags to leverage both personally and within your Teams. Tags are a great way to effectively organize your notes, follow-ups, to-do tasks, and much more.


 
•  OneNote Search & Finding Tags – Do you ever find yourself looking through all of your notes, not remembering where you left key information or details? OneNote has strong search capabilities that allow you to query through your notes to find exactly what you’re looking for. More importantly, you can scope your searches to:
     • A single Page
     • A single Section
     • Across multiple Sections
     • Across an entire Notebook, or
     • All of your Notebooks
This scoping capability is extremely useful when you’re leveraging OneNote for multiple projects, initiatives, and teams.
 
OneNote Tags are also searchable, allowing you to locate your to-do’s, contact information, ideas, and other key insights across the tool. You can query Tags across Pages, Sections, and all Notebooks and are also given the ability to create a Summary Page that will aggregate all of your Tags, by type, onto a single Page.


 
 
I hope that this helps to outline how Teams, Groups, SharePoint and OneNote all work together to create a simple, collaborative workspace for you and your users. I also hope that this provided you with some great ideas and insights on how to leverage OneNote within Teams, as well as some simple OneNote features that can make your life a lot easier. Happy note taking, everyone!
 

General; Office 365; Project Management; Tips & Tricks | Michael Essling

© Protiviti 2019. All rights reserved.   |   Privacy Policy