How Prescriptive Guidance Helps Win Competitive Assessments



Why invest in prescriptive guidance for platform impact?

While the answer is obvious to many, it’s not as obvious to others, so I’ll attempt to paint the picture here.

One example is Building Secure ASP.NET Applications.  It was the first “blue book” at Microsoft.

But it was Improving Web Application Security that really made people take notice.  It was downloaded more than 800,000 times in its first six months, and it changed how many people in the industry thought about and approached security.

It’s also the guide that helped many customers switch from Java to .NET.

An interesting side note about Building Secure is that its Forms Authentication approach was baked into the Whidbey platform (ASP.NET 2.0).

Prescriptive Guidance Shapes Platform Success

Prescriptive guidance played a strategic role in both shaping the platform and driving exponential customer success on the platform.

These guides have helped the world at large find and share platform best practices, build mental models and conceptual frameworks, and create systems and approaches that scale success and foster powerful ecosystems.

They’ve also helped spin up offerings for the field, reduce support costs, and win competitive assessments.

Ultimately, prescriptive guidance gives a strategic look at platform pain points as well as competitive analysis, and a consolidated set of success patterns to run with.

From patents to methodologies to better ways for better days, prescriptive guidance has been the definitive way to improve platform success sustainably – a durable backdrop that provides continuity for the platform over time.

Benefits at a Glance

Here is a quick rundown of some of the key ways that prescriptive guidance helps customers and platform builders win time and again:

  • Platform Playbooks – Serve as platform playbooks for the field, support, customers, and partners
  • Shaping the Platform and Tools – Shape the platform and tools by testing out patterns and practices, as well as methodologies and methods, with the broad community before baking them into the platform and tools.
  • Scaling Success Patterns – Broadly scale proven practices and success patterns for predictable results
  • Roadmaps for Platform Adoption – Lay out roadmaps for technology adoption as well as success patterns
  • Competitive Wins – Win competitive assessments (the prescriptive guides have played a critical role in influencing industry analysts and in winning competitive assessments time and again)
  • Innovation for Exponential Success – Innovate in methodologies and methods for exponentially improving customer success on the platform
  • Frame and Name the Problem Domains – Frame out and name the problem spaces and domains (when you frame out and name a space, whether through patterns or pattern languages, you create a shared vocabulary and model that empowers people to make forward progress at a faster pace and in a more deliberate way.)

The list goes on, but the essence is that these playbooks help customers make the most of the platform by sharing the know-how through prescriptive architectural guidance.

Example Prescriptive Guidance

I won’t speak for all the prescriptive guidance, but since I created the bulk of the Blue Books on the original Microsoft patterns & practices team, it’s easiest for me to speak to the ones I created.

Here is a summary of the impact that can help you better understand the value of Blue Books from a broader perspective.

Application Architecture Guide, Second Edition

  • The platform playbook for Microsoft’s application platform
  • Canonical application types for Web app, RIA, Rich Client, Mobile, and Web Services
  • Baseline best practices for application architecture and design
  • Templates baked into Visual Studio
  • Praise from Ray Ozzie
  • Praise from Grady Booch
  • Conceptual Framework for Application Architecture

Building Secure ASP.NET Applications
(aka The first official Microsoft “Blue Book”)

  • End-to-End Application Scenarios for Web Apps
  • Created a highly reusable set of Application Patterns
  • Baseline architectures and success patterns shared broadly inside and outside Microsoft

Improving .NET Application Performance and Scalability
(aka “Perf and Scale”)

  • Repeatable performance model
  • Created a highly-effective method for performance modeling
  • Performance Engineering approach baked into Visual Studio
  • 4 patents filed for performance engineering
  • Performance Engineering approach widely adopted inside and outside Microsoft
  • Used for offerings in Microsoft Consulting Services
  • Rules baked into Microsoft Best Practices Analyzer Wizard (MBPA)
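To give a concrete flavor of what a "repeatable performance model" means in practice, here is a minimal sketch in Python. The scenario names and budget numbers are made up for illustration and are not taken from the guide; the idea is simply that each key scenario gets an execution-time budget that measurements are checked against.

```python
# Hypothetical sketch of a performance model: each key scenario gets an
# execution-time budget, and measurements are checked against it.
# Scenario names and budget values below are illustrative only.
budgets_ms = {"login": 200, "search": 500, "checkout": 1000}

def within_budget(scenario: str, measured_ms: float) -> bool:
    """True if the measured response time meets the scenario's budget."""
    return measured_ms <= budgets_ms[scenario]
```

The value of the model is that budget violations become objective, testable facts early in the life cycle, rather than arguments after deployment.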

Improving Web Application Security
(aka “Threats and Countermeasures”)

  • Repeatable security model for Web applications
  • Created a highly-effective method for threat modeling
  • Created a knowledge base of threats, attacks, vulnerabilities, and countermeasures
  • Security model for network, host, and application security
  • Security Engineering approach baked into Visual Studio
  • 4 patents filed for application security
  • Used for offerings in Microsoft Consulting Services
  • Rules baked into Microsoft Best Practices Analyzer Wizard (MBPA)
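As a flavor of the threat/countermeasure pairs that kind of knowledge base catalogs, here is the classic SQL injection example, sketched in Python with an in-memory SQLite database rather than the guide's ASP.NET stack:

```python
import sqlite3

# Illustrative threat/countermeasure pair: SQL injection vs. parameterized
# queries. Sketched with SQLite; the guide's own examples target ASP.NET.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_role(name: str):
    # Countermeasure: bind untrusted input as a parameter instead of
    # concatenating it into the SQL string.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)
    ).fetchall()
```

Because the parameterized query treats the attacker's input as data, a classic injection string like `' OR '1'='1` simply fails to match any user.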

Improving Web Services Security

  • Security model for Web Services
  • End-to-End Application Scenarios for Web Services
  • Created a highly reusable set of Application Patterns
  • Baseline architectures and common success patterns shared broadly inside and outside Microsoft

Performance Testing Guidance for Web Applications

  • Created a highly-effective method for performance testing Web applications
  • Performance Testing approach widely adopted inside and outside Microsoft
  • Used for offerings in Microsoft Consulting Services

Security Engineering Explained

  • Created a model for baking security into the life cycle
  • Helped shift thinking from security “reviews” to “inspections”
  • Overlays security-specific activities on product development life cycles

Team Development and Visual Studio Team Foundation Server

  • Created a glide-path for TFS adoption (source control, build, task tracking / reporting, process)

End-to-End Application Scenarios and Solutions

Here’s an example of an application scenario.

We use application scenarios to show how to solve end-to-end problems.  It’s effectively a baseline architecture based on successful solutions.

Here is an example from the original WCF Security Guide:



We share them as whiteboard-style sketches so they are easy to follow.

Methodologies and Methods

Methodologies, frameworks, and approaches are nice ways to wrap up and package a set of related activities that you can use as a baseline for your process or overlay on what you already do.

Methods are step-by-step techniques for producing effective results and they are a powerful way to share expertise.

Methodologies and methods are how we create exponential results and amplify our impact.

Example Methodology – Agile Security Engineering


Example Method – Threat Modeling Technique


Conceptual Frameworks and Mental Models

We use mental models, conceptual frameworks, and information models to learn and share the problem space.

Example Conceptual Framework for Web Security


Example Mental Model for Application Architecture


Hot Spots

Hot Spots are basically heat maps of pain points and opportunities.  We use them as a lens for seeing customer pain points and opportunities, and for prioritizing our investments.  They also help us identify, organize, and share scenarios, as well as the principles, patterns, practices, and anti-patterns for key engineering decisions.  Hot Spots are a powerful tool for product planning and for building prescriptive guidance, platform, and tools.
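One way to picture Hot Spots is as named buckets that organize pain points and scenarios. The following is a hypothetical sketch with illustrative category names, not an actual patterns & practices artifact:

```python
# Hypothetical sketch: hot spots as named buckets that organize pain
# points and scenarios. The category names here are illustrative only.
security_hot_spots = {
    "Input Validation": ["SQL injection", "cross-site scripting"],
    "Authentication": ["credential theft", "weak password policy"],
    "Exception Management": ["information disclosure via stack traces"],
}

def pain_points(hot_spot: str) -> list:
    """Look up the pain points organized under a given hot spot."""
    return security_hot_spots.get(hot_spot, [])
```

Once the buckets exist, scenarios, principles, and anti-patterns all have an obvious place to live, which is what makes Hot Spots useful for planning.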

Example of Security Hot Spots


Example of Architecture Hot Spots


Scenarios Organized by Architecture Hot Spots


Competitive Wins

I like competitive studies.  I’m usually more interested in the methodology than the outcome.  The methodology acts as a blueprint for what’s important in a particular problem space.

Prescriptive guidance has consistently been used for winning competitive assessments, or at least making a significant impact in key areas.

Whether there’s a gap in the tools or a gap in the platform, prescriptive guidance can smooth it out by creating a success path for customers.

Security Innovation Security Engineering Study

The Security Innovation Security Engineering study, Security in the Application Lifecycle – Microsoft and IBM Development Platforms Compared, evaluated how each platform supports security across the development life cycle.

My favorite quote in the study is “The patterns & practices security guidance covers the key security engineering activities better than any other resource we’ve found.”

I think this reflects the fact that we had more than 2,500 pages of security guidance and integrated our guidance into the tools that developers use.

I’ve summarized the study here for quick reference:

Security Innovation evaluated the guidance and tools of Microsoft’s and IBM’s development platforms.  The study compared the support available to a development team via security guidance, documentation, and security-focused features in the life-cycle tool suites.  Gartner reviewed the approach.

Evaluation Criteria

  • Coverage.  How well do the provided tools and guidance cover the key set of security areas?
  • Quality.  How effective and accurate are the tools and guidance?
  • Visibility.  How easy is it to find the tools and guidance and then apply them to your security needs?
  • Usability.  Are the tools and guidance precise, comprehensive, and easy to use?


Rating Scale

  • Outstanding: 81-100%
  • Good: 61-80%
  • Average: 41-60%
  • Below Average: 21-40%
  • Poor: 0-20%
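As a quick illustration, the rating bands above map cleanly onto a simple threshold lookup. This is just a sketch of the scale as stated; the band names and cutoffs come from the study:

```python
def rating_band(score: int) -> str:
    """Map a 0-100 percentage score to the study's rating band."""
    if score >= 81:
        return "Outstanding"
    if score >= 61:
        return "Good"
    if score >= 41:
        return "Average"
    if score >= 21:
        return "Below Average"
    return "Poor"
```

For example, Microsoft's 67% overall lands in "Good" and IBM's 36% overall lands in "Below Average" under this scale.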

Scorecard Categories

  • Basic Platform Security.  When used in accordance with its documentation, a platform should be inherently secure.
  • Platform Security Services.  A mature platform should include services that make it easier for developers to implement security features in their applications.
  • Platform Security Guidance. A secure platform is much less useful if it lacks proper guidance.
  • Software Security Engineering Guidance.  It is not possible to develop a secure application unless security is a focus during every phase of the development lifecycle.
  • Security Tools.  A secure platform should include tools that make it easier to define, design, implement, test, and deploy a secure application.

Results of the Study

First, here are a couple of key points; the summaries follow below:

  • Microsoft beat IBM in every category around guidance.
  • Microsoft beat IBM in three out of four categories around tools.


IBM Results

  1. Platform Overall
    1. Overall: 36%
    2. Coverage: 62%
    3. Quality: 70%
    4. Visibility: 17%
    5. Usability: 72%
  2. Platform Security Guidance
    1. Overall: 50%
    2. Coverage: 81%
    3. Quality: 85%
    4. Visibility: 17%
    5. Usability: 84%
  3. Security Engineering Guidance
    1. Overall: 25%
    2. Coverage: 50%
    3. Quality: 64%
    4. Visibility: 17%
    5. Usability: 69%
  4. Security Tools
    1. Overall: 32%
    2. Coverage: 55%
    3. Quality: 59%
    4. Visibility: 56%
    5. Usability: 63%


Microsoft Results

  1. Platform Overall
    1. Overall: 67%
    2. Coverage: 88%
    3. Quality: 85%
    4. Visibility: 61%
    5. Usability: 80%
  2. Platform Security Guidance
    1. Overall: 76%
    2. Coverage: 93%
    3. Quality: 85%
    4. Visibility: 67%
    5. Usability: 91%
  3. Security Engineering Guidance
    1. Overall: 78%
    2. Coverage: 100%
    3. Quality: 89%
    4. Visibility: 67%
    5. Usability: 79%
  4. Security Tools
    1. Overall: 47%
    2. Coverage: 71%
    3. Quality: 78%
    4. Visibility: 50%
    5. Usability: 68%
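One way to read the "three out of four categories around tools" point is against the Security Tools criterion scores. This quick sketch compares the numbers as reported in the two scorecards:

```python
# Security Tools criterion scores (percent) as reported in the study.
ibm_tools = {"Coverage": 55, "Quality": 59, "Visibility": 56, "Usability": 63}
ms_tools = {"Coverage": 71, "Quality": 78, "Visibility": 50, "Usability": 68}

# Which criteria each platform won outright.
ms_wins = [c for c in ms_tools if ms_tools[c] > ibm_tools[c]]
ibm_wins = [c for c in ms_tools if ibm_tools[c] > ms_tools[c]]
```

Microsoft leads on Coverage, Quality, and Usability, while IBM edges ahead only on Visibility, which is consistent with the three-out-of-four reading.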

Quotes from the Study

  • Microsoft’s overall rating of 67% reflects the impressive level of focus Microsoft has applied to application security in the past several years.
  • IBM’s overall score of 36% is the result of a more disjointed approach to security.  Security guidance is spread throughout the IBM web site and is difficult to discover.
  • The patterns & practices security guidance covers the key security engineering activities better than any other resource we’ve found.

OpenHack 4 (eWeek Labs): Web Application Security

This was an interesting study because it was effectively an open “hack me with your best shot” competition.

I happened to know the folks on the Microsoft side, like Erik Olson and Girish Chander, who helped secure the application, so it had some of the best available security engineering.

In fact, customers commented that it’s great that Microsoft can secure its applications … but what about its customers?

That comment was the inspiration for our Improving Web Application Security: Threats and Countermeasures guide.

I’ve summarized OpenHack 4 here, so it’s easier for me to reference.

Overview of OpenHack 4
In October 2002, eWeek Labs launched its fourth annual OpenHack online security contest.  It was designed to test enterprise security by exposing systems to the real-world rigors of the Web.

Microsoft and Oracle were given a sample Web application by eWeek and were asked to redevelop the application using their respective technologies. Individuals were then invited to attempt to compromise the security of the resulting sites.

Acceptable breaches included cross-site scripting attacks, dynamic Web page source code disclosure, Web page defacement, posting malicious SQL commands to the databases, and theft of credit card data from the databases used.
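Of the acceptable breaches listed, cross-site scripting is the easiest to illustrate. Here is a minimal sketch of the standard countermeasure (output encoding), in Python rather than the ASP.NET stack used in the contest:

```python
import html

def render_comment(user_input: str) -> str:
    # Countermeasure for cross-site scripting: HTML-encode untrusted
    # output before embedding it in a page.
    return "<p>" + html.escape(user_input) + "</p>"
```

An attacker-supplied `<script>` tag comes out as inert text (`&lt;script&gt;...`) instead of executing in the victim's browser.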

Outcome of the Competition
The Web site built by Microsoft engineers using the Microsoft .NET Framework, Microsoft Windows 2000 Advanced Server, Internet Information Services 5.0, and Microsoft SQL Server 2000 successfully withstood over 82,500 attempted attacks to emerge from the eWeek OpenHack 4 competition unscathed.

@Stake Security Study

One of my favorite studies was the original @Stake study comparing .NET 1.1 vs. IBM’s WebSphere security, not just because our body of guidance made a direct and substantial difference in the outcome, but because @Stake used a comprehensive set of categories and an evaluation criteria matrix that demonstrated a lot of depth.

Because the information from the original report can be difficult to find and distill, I’m summarizing it below:

Overview of Report
@Stake, Inc., an independent security consulting firm, released results of a Microsoft-commissioned study that found Microsoft’s .NET platform to be superior to IBM’s WebSphere for secure application development and deployment.

@Stake performed an extensive analysis comparing security in the .NET Framework 1.1, running on Windows Server 2003, to IBM WebSphere 5.0, running on both Red Hat Linux Advanced Server 2.1 and a leading commercial distribution of Unix.

Overall, @Stake found that:

  • Both platforms provide infrastructure and effective tools for creating and deploying secure applications
  • The .NET Framework 1.1 running on Windows Server 2003 scored slightly better with respect to conformance to security best practices
  • The Microsoft solution scored even higher with respect to the ease with which developers and administrators can implement secure solutions

@Stake evaluated the level of effort required for developers and system administrators to create and deploy solutions that implement security best practices, and to reduce or eliminate the most common attack surfaces.

Evaluation Criteria

  • Best practice compliance.  For a given analysis topic, to what degree did the platform permit implementation of best practices?
  • Implementation complexity.   How difficult was it for the developer to implement the desired feature?
  • Documentation and examples.  How appropriate was the documentation?
  • Implementor competence.  How skilled did the developer need to be in order to implement the security feature?
  • Time to implement.  How long did it take to implement the desired security feature or behavior?

Ratings for the Evaluation Criteria

  1. Best Practice Compliance Ratings
    1. Not possible
    2. Developer implement
    3. Developer extend
    4. Wizard
    5. Transparent
  2. Implementation Complexity Ratings
    1. Large amount of code
    2. Medium amount of code
    3. Small amount of code
    4. Wizard +
    5. Wizard
  3. Quality of Documentation and Sample Code Ratings
    1. Incorrect or Insecure
    2. Vague or Incomplete
    3. Adequate
    4. Suitable
    5. Best Practice Documentation
  4. Developer/Administrator Competence Ratings
    1. Expert (5+ years of experience)
    2. Expert/intermediate (3-5 years of experience)
    3. Intermediate
    4. Intermediate/novice
    5. Novice (0-1 years of experience)
  5. Time to Implement
    1. High (More than 4 hours)
    2. Medium to High (1 to 4 hours)
    3. Medium (16-60 minutes)
    4. Low to Medium (6-15 minutes)
    5. Low (5 minutes or less)

Scorecard Categories
The scorecard was organized into application server, Web server, and host and operating system categories.  Each category was divided into smaller categories to test the evaluation criteria (best practice compliance, implementation complexity, quality of documentation, developer competence, and time to implement).

Application Server Categories

  1. Application Logging Services
    1. Exception Management
    2. Logging Privileges
    3. Log Management
  2. Authentication and Access Control
    1. Login Management
    2. Role Based Access Control
    3. Web Server Integration
  3. Communications
    1. Communication Security
    2. Network Accessible Services
  4. Cryptography
    1. Cryptographic Hashing
    2. Encryption Algorithms
    3. Key Generation
    4. Random Number Generation
    5. Secrets Storage
    6. XML Cryptography
  5. Database Access
    1. Database Pool Connection Encryption
    2. Data Query Safety
  6. Data Validation
    1. Common Validators
    2. Data Sanitization
    3. Negative Data Validation
    4. Output Filtering
    5. Positive Data Validation
    6. Type Checking
  7. Information Disclosure
    1. Error Handling
    2. Stack Traces and Debugging
  8. Runtime Container Security
    1. Code Security
    2. Runtime Account Privileges
  9. Web Services
    1. Credentials Mapping
    2. SOAP Router Data Validation

Host and Operating System Categories

  1. IP Stack Hardening
    1. Protocol Settings
  2. Service Minimization
    1. Installed Packages
    2. Network Services

Web Server Categories

  1. Architecture
    1. Security Partitioning
  2. Authentication
    1. Authentication Input Validation
    2. Authentication Methods
    3. Credential Handling
    4. Digital Certificates
    5. External Authentication
    6. Platform Integrated Authentication
  3. Communication Security
    1. Session Encryption
  4. Information Disclosure
    1. Error Messages and Exception Handling
    2. Logging
    3. URL Content Protection
  5. Session Management
    1. Cookie Handling
    2. Session Identifier
    3. Session Lifetime

The Bottom Line on Prescriptive Guidance

The bottom line for me is that prescriptive guidance has helped shape platforms and tools, and create glide-paths for customers through mental models, methodologies, and methods.

These guides have been a powerful way to share success patterns, help paint the bigger picture, and connect the dots across platform, tools, and guidance.

Adoption and usage have accelerated over the years to the point where just about any customer in the application development space who works with the Microsoft platform is familiar with either patterns & practices or the Microsoft Blue Books.

Prescriptive guides have been the freemium offerings that have paved the way for premium experiences.

You Might Also Like

How Prescriptive Guidance Helps Win Customers

