SharePoint Important Security Update April 2015

April 21, 2015 Leave a comment

A few days ago, Microsoft released security bulletin MS15-036, rated as "Important". This bulletin mentions several vulnerabilities in SharePoint 2013 and SharePoint 2010 that could lead to an elevation of privilege. Even if, so far, these vulnerabilities do not seem to have been exploited, I recommend patching your systems with the following updates:

 


Office 365: Now you can log in with Username OR Mail Address…


…Or whatever you want! It's great news that has been relayed pretty discreetly: no matter how you authenticate on Office 365 (DirSync, federated authentication, or cloud only), you can now log in with any attribute from the user's profile.

Previously, in most cases, you could only authenticate users with their UPN (User Principal Name).

It's a real improvement and the end of a limitation that sometimes dissuaded organizations with a complex Active Directory topology from moving to the cloud.

Here is all the info you need to enjoy this great new capability:

 

Categories: Office 365

Bulk change recovery model on all databases

August 12, 2013 2 comments

I don’t know where I found this script, but I use it every day (well, every week may be more accurate) on test environments to switch all databases to the simple recovery model and save a lot of disk space.

use [master]
go
-- Declare container variables for each column we select in the cursor
declare @databaseName nvarchar(128)
 -- Define the cursor name
declare databaseCursor cursor
 -- Define the dataset to loop
for 
select [name] from sys.databases where name not in ('Master','tempdb','model','msdb')
 -- Start loop
open databaseCursor
 -- Get information from the first row
fetch next from databaseCursor into @databaseName
--  Loop until there are no more rows
while @@fetch_status = 0
begin
  print 'Setting recovery model to Simple for database [' + @databaseName + ']'
  exec('alter database [' + @databaseName + '] set recovery Simple')
  print 'Shrinking logfile for database [' + @databaseName + ']'
  exec(' use [' + @databaseName + '];' +'  
     declare @logfileName nvarchar(128);
    set @logfileName = (
        select top 1 [name] from sys.database_files where [type] = 1
    );
    dbcc shrinkfile(@logfileName,1);
    ')  
    -- Get information from next row
    fetch next from databaseCursor into @databaseName 
end
 --  End loop and clean up
close databaseCursor
deallocate databaseCursor 
go

All you have to do is copy/paste this script into a SQL Server Management Studio query window. It works on numerous SQL Server versions (I have used it from SQL Server 2005 to 2012), regardless of the system language…

Categories: SQL Server

Using ContentTypeId in CAML Query

June 28, 2013 Leave a comment

You can specify a content type ID in a CAML query using the <FieldRef Name=’ContentTypeId’ /> clause. But to make this query work, you have to specify the right value type. Using “Text” or “Computed” may work on some environments but fails on others…

It seems that the best choice is to use the “ContentTypeId” value type. This value type works pretty well with the Eq or BeginsWith conditions.

Example:

using (SPSite site = new SPSite(SiteURL))
{
    using (SPWeb web = site.OpenWeb())
    {
        SPList list = web.Lists[ListName];
        SPQuery qry = new SPQuery();
        qry.Query = String.Format(@"
            <Where>
                <And>
                    <Eq>
                        <FieldRef Name='Title'/>
                        <Value Type='Text'>{0}</Value>
                    </Eq>
                    <BeginsWith>
                        <FieldRef Name='ContentTypeId'/>
                        <Value Type='ContentTypeId'>{1}</Value>
                    </BeginsWith>
                </And>
            </Where>", myDocumentTitle, myContentTypeID);
        SPListItemCollection myItems = list.GetItems(qry);
    }
}

This article mentions the main value types, but ContentTypeId is missing from it.

Thanks Pierre

Exception “Object reference” while transferring a Document Set to a Drop Off Library


SharePoint 2010 offers a feature that allows document transfers between 2 SharePoint Sites.

While using this feature is pretty straightforward for plain documents, transferring a Document Set can raise an error. The SharePoint ULS logs mention a NullReferenceException:

OfficialFile: File http://spSite/DocLib/MyDocSet.zip was not saved at router location /sites/Destination/DestinationDropOffLibrary. Exception Object reference not set to an instance of an object. thrown at: at Microsoft.Office.RecordsManagement.RecordsRepository.OfficialFileCore.SaveFileToFinalLocation…

Site http://spSite/sites/Destination/_vti_bin/OfficialFile.asmx is not properly configured for this request.

A quick resolution for this issue is to create at least one content organizer rule that matches the Document Set content type. This rule doesn’t even have to be active:

With this rule in place, sending a Document Set to a Drop Off Library finally works.

Categories: SharePoint 2010

Recording stops in the logging database, then comes back the day after

January 21, 2013 3 comments

SharePoint has a great feature, “Usage Data Collection“. This feature writes to a database (WSS_Usage, WSS_Logging, or whatever name you set) every single action that happens on your SharePoint farm, from users’ visits to the duration of timer job executions, including Windows performance monitor values. But sometimes recording stops for a day, then comes back the day after.

According to the documentation, a timer job (named Microsoft SharePoint Foundation Usage Data Import) is in charge of writing into specific tables of this logging database. For this purpose, there is one data table per day of the month (up to 32) AND per recorded event type:

All these tables are summarized in a SQL view dedicated to each event type:

SharePoint writes all this data using a stored procedure named dbo.prc_Insert<UsageDefinition>.

Digging further, we can see that when writing data, the size of the current partition (the data table that stores data for the day) is checked. It must be lower than the maximum storage limit (in bytes) divided by the number of retention days set for the event type (check lines 31-32 of the procedure). These 2 parameters are extracted from the dbo.Configuration data table by the dbo.fn_GetConfigValue
function (under Functions/Scalar-valued Functions) of this same logging database (not the farm configuration).


So, to raise the daily limit, we can:

  • Lower the number of retention days.
  • Raise the maximum total bytes limit.
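To make this check concrete, here is a quick sketch in plain Python (nothing SharePoint-specific; the two values are assumed defaults, check your own installation) of the per-day limit the procedure enforces:

```python
# Per-day partition limit enforced by the insert procedure (sketch).
# Both values live in the logging database's dbo.Configuration table;
# the figures below are assumed defaults.
max_total_bytes = 6_000_000_000   # "Max Total Bytes" for the event type
days_retained = 14                # retention period in days

# New rows are refused once the current day's partition exceeds
# the total limit divided by the retention period:
daily_limit = max_total_bytes // days_retained
print(daily_limit)  # 428571428 bytes, i.e. roughly 0.4 GB per day
```

Lowering the retention period or raising the total bytes limit both raise this per-day ceiling, which is exactly what the two options above do.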

Of course, we won’t work directly on the database; PowerShell allows us to specify the retention period. For instance, for the “Page Requests” usage definition:

Set-SPUsageDefinition -Identity "Page Requests" -DaysRetained 21
 

To update the total limit, we have to be a bit smarter. The default size is 6000000000 bytes, about 5.6 GB; let’s update it to 10 GB => 10*1024*1024*1024 => 10737418240

$definition= get-SPUsageDefinition -Identity "Page Requests" 
$definition.MaxTotalSizeInBytes=10737418240 
$definition.Update()
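As a quick sanity check of the 10 GB arithmetic above (plain Python, nothing SharePoint-specific):

```python
# 10 GB expressed in bytes: 10 * 1024^3
new_limit = 10 * 1024 ** 3
print(new_limit)  # 10737418240
```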
 

The Configuration data table is not updated immediately. In fact, I could not find how the parameter is saved to the database, nor any relevant timer job (any info about that is welcome), but anyway, the day after, the retention period is properly saved 🙂 but not the max size limit 🙁

So, to update this parameter, we have no other choice but to update the database directly:

update WSS_Logging.dbo.Configuration set ConfigValue='10737418240'
where ConfigName='Max Total Bytes - RequestUsage'

where RequestUsage is the event type you want to extend …

Now, our limit is raised:

Doing this database update can raise questions about vendor support. But according to the documentation, this logging database has a specific support status:

  • “Moreover, the logging database is the only SharePoint Server 2010 database for which you can customize reports by directly modifying the database.” from the article Understanding the Logging Database
  • “The Usage and Health Data Collection database is the only SharePoint database that supports schema modifications” from the article Database types and descriptions.

So I think this operation preserves Microsoft support, but don’t tell them I told you!

Categories: SharePoint 2010

KB2756920 Crashes SharePoint

January 14, 2013 4 comments

Hello

We noticed on several SharePoint farms running Windows Server 2008 R2 (pre-SP1) that the securitytoken.svc service fails to compile since KB2756920 was deployed.

Look for this error in the SharePoint logs:

Exception: System.ServiceModel.ServiceActivationException: The service ‘http://<server>/SecurityTokenServiceApplication/securitytoken.svc’ cannot be activated due to an exception during compilation

Removing this KB or deploying Windows Server 2008 R2 SP1 seems to correct the issue.

Some articles mention a dependency on claims authentication, but we encountered this exception on a basic-authentication web app…

More info: Here

Categories: SharePoint 2010