In-Depth Look at Tuckman’s Ladder and Subsequent Works as a Tool for Managing a Project Team (SANS Reading Room)


Best Practices

Featuring 91 Papers as of March 1, 2017

  • In-Depth Look at Tuckman’s Ladder and Subsequent Works as a Tool for Managing a Project Team

    STI Graduate Student Research
    by Aron Warren – March 1, 2017 

    Bruce Tuckman’s 1965 research on modeling group development, titled “Developmental Sequence in Small Groups,” laid out a framework consisting of four stages a group will transition between while members interact with each other: forming, storming, norming, and performing. This paper will describe in detail the original Tuckman model as well as derivative research in group development models. Traditional and virtual team environments will both be addressed to assist IT project managers in understanding how a team evolves over time with a goal of achieving a successful project outcome.


  • Indicators of Compromise TeslaCrypt Malware

    STI Graduate Student Research
    by Kevin Kelly – February 16, 2017 

    Malware has become a growing concern in a society of interconnected devices and real-time communications. This paper shows how to analyze live ransomware samples: how the malware behaves locally, over time, and within the network. Analyzing live ransomware gives a unique three-dimensional perspective, visually locating crucial signatures and behaviors efficiently. In lieu of reverse engineering or parsing the malware executable’s infrastructure, live analysis provides a simpler method to root out indicators. Ransomware touches nearly every file and many registry keys, so analysis can be done, but it must be focused. Analyzing malware capabilities across different datasets, including process monitoring, flow data, registry key changes, and network traffic, will yield indicators of compromise. These indicators will be collected using open source tools such as the Sysinternals suite, Fiddler, Wireshark, and Snort, to name a few, and used to produce defensive countermeasures against unwanted advanced adversary activity on a network. A virtual appliance platform with a simulated production Windows 8 OS will be created, infected, and processed to collect indicators for securing enterprise systems. Different tools will leverage these datasets to gather indicators, view the malware on multiple layers, contain compromised hosts, and prevent future infections.
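
    The workflow this abstract describes — harvesting indicators from an infected host and turning them into defensive countermeasures — ultimately reduces to matching observed artifacts against a known-bad indicator set. A minimal sketch of that matching step (the hashes and file paths below are invented for illustration, not real TeslaCrypt indicators):

```python
import hashlib

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 digest of a byte string as lowercase hex."""
    return hashlib.sha256(data).hexdigest()

def match_iocs(observed: dict, known_bad: set) -> list:
    """Flag observed artifacts whose hash appears in the IOC set.

    observed  -- mapping of artifact name -> raw bytes
    known_bad -- set of SHA-256 hex digests collected during analysis
    """
    return sorted(name for name, data in observed.items()
                  if sha256_hex(data) in known_bad)

# Illustrative run with synthetic payloads (not real malware bytes):
iocs = {sha256_hex(b"dropper-stage-1"), sha256_hex(b"ransom-note-template")}
hosts_files = {
    "C:/Users/victim/AppData/xfwyt.exe": b"dropper-stage-1",
    "C:/Users/victim/notes.txt": b"meeting at noon",
}
print(match_iocs(hosts_files, iocs))  # flags only the dropper
```

    In practice the same compare-against-baseline idea applies to the other datasets the paper names (registry keys, process names, network signatures), with the indicator sets fed into tools such as Snort rather than a script.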


  • Forensication Education: Towards a Digital Forensics Instructional Framework

    STI Graduate Student Research
    by J. Richard “Rick” Kiper – February 3, 2017 

    The field of digital forensics is a diverse and fast-paced branch of cyber investigations. Unfortunately, common efforts to train individuals in this area have been inconsistent and ineffective, as curriculum managers attempt to plug in off-the-shelf courses without an overall educational strategy. The aim of this study is to identify the most effective instructional design features for a future entry-level digital forensics course. To achieve this goal, an expert panel of digital forensics professionals was assembled to identify and prioritize the features, which included general learning outcomes, specific learning goals, instructional delivery formats, instructor characteristics, and assessment strategies. Data was collected from participants using validated group consensus methods such as Delphi and cumulative voting. The product of this effort was the Digital Forensics Framework for Instruction Design (DFFID), a comprehensive digital forensics instructional framework meant to guide the development of future digital forensics curricula.


  • Back to Basics: Focus on the First Six CIS Critical Security Controls

    Analyst Paper
    by John Pescatore – January 24, 2017 

    Rather than a lack of choices in security solutions, a major problem in cyber security is an inability to implement mature processes – many organizations lack a defined and repeatable process for selecting, implementing and monitoring the security controls that are most effective against real-world threats. This paper explores how the Center for Internet Security (CIS) Critical Security Controls has proven to be an effective framework for addressing that problem.


  • The SANS State of Cyber Threat Intelligence Survey: CTI Important and Maturing

    Analyst Paper
    by Dave Shackleford – August 15, 2016 

    It’s 2016, and the attacks (and attackers) continue to be more brazen than ever. In this threat landscape, the use of cyber threat intelligence (CTI) is becoming more important to IT security and response teams than ever before. This paper provides survey results along with advice and best practices for getting the most out of CTI.




  • 2016 State of Application Security: Skills, Configurations and Components

    Analyst Paper
    by Johannes Ullrich, PhD – April 26, 2016 

    Survey results reveal that it is critical for an overall enterprise security program to coordinate efforts among developers, architects and system administrators—particularly since many software vulnerabilities are rooted in configuration issues or third-party components, not just in code written by the development team. Read on to learn more.


  • Threat Hunting: Open Season on the Adversary

    Analyst Paper
    by Dr. Eric Cole – April 12, 2016 

    Nearly 86% of organizations responding to the survey want to be doing the hunting, albeit informally: more than 40% do not have a formal threat-hunting program in place. Results indicate that hunting is providing benefits, including finding previously undetected threats, reducing attack surfaces, and enhancing the speed and accuracy of response. They also suggest that organizations want to improve their threat-hunting programs and realize more benefits from threat hunting.


  • Securing Jenkins CI Systems

    STI Graduate Student Research
    by Allen Jeng – April 8, 2016 

    With over 100,000 active installations worldwide, Jenkins has become the top choice for continuous integration and automation. A survey conducted by CloudBees during the 2012 Jenkins Users Conference concluded that 83 percent of respondents consider Jenkins to be mission critical. The November 2015 remotely exploitable Java deserialization vulnerability underscores the need to lock down and monitor Jenkins systems. Exploitation of this weakness enables attackers to gain access to critical assets, such as source code, that Jenkins manages. Enabling password security is the general recommendation for securing Jenkins. Unfortunately, this necessary security measure can easily be defeated with a packet sniffer because passwords are transmitted over the wire in clear text. This paper will look at ways to secure Jenkins systems, as well as the deployment of intrusion detection systems to monitor critical assets controlled by Jenkins CI systems.
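
    To see why cleartext transport defeats password security, consider what a sniffer capturing an unencrypted Jenkins login actually records: an ordinary HTTP form body from which credentials fall out with a one-line parse. A sketch (the field names follow Jenkins’ form-based login; the credentials are invented):

```python
from urllib.parse import parse_qs

# A captured HTTP POST body, exactly as a packet sniffer would see it
# on the wire when Jenkins is served over plain HTTP.
captured_body = "j_username=builder&j_password=s3cret&from=%2F&Submit=log+in"

fields = parse_qs(captured_body)
username = fields["j_username"][0]
password = fields["j_password"][0]
print(username, password)  # credentials recovered with no cryptanalysis at all
```

    The implied remediation is the one the paper pursues: terminate logins over TLS so the form body is never visible on the wire.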





  • Password Management Applications and Practices

    by Scott Standridge – February 23, 2016 

    Password compromise is still the root cause behind many cyber breaches. In 2014, two out of three breaches involved attackers using stolen or misused credentials (Higgins, 2014).




  • Practical Security Considerations for Managed Service Provider On-Premise Equipment

    STI Graduate Student Research
    by Mike Yeatman – October 5, 2015 

    Many organizations are not adequately staffed to perform 24×7 monitoring of network and systems infrastructure, or security activities such as vulnerability scanning and penetration testing. Use of a third-party managed service provider to fill this gap is on the rise. It is typical for managed service providers to require the implementation of an on-premises device or appliance at the customer location(s). But who watches the watcher? Service providers must fully harden any on-premises device placed on a customer network, and they must take steps to protect their own infrastructure against the propagation of an attack or compromise from the customer network and systems.
    Customers must be informed and work closely with service providers to ensure proper placement of the on-premises device so that it does not become a vector for compromise against the customer network. Collectively, and in accordance with a set of standards and guidelines, all stakeholders in the managed services relationship must set a sustainable benchmark that sufficiently reduces the chances of third-party on-premises equipment becoming the root cause, or a contributing cause, of a security compromise.


  • Breaking the Ice: Gaining Initial Access

    STI Graduate Student Research
    by Phillip Bosco – August 28, 2015 

    While companies are spending an increasing amount of resources on security equipment, attackers are still successful at finding ways to breach networks. This is a compound problem with many moving parts, driven by misinformation within the security industry and by companies focusing on areas of security that yield unimpressive results. A company cannot properly defend against what it does not adequately understand, and companies frequently misunderstand both their own security defense systems and the attacks that cyber criminals commonly use today. These misunderstandings result in attackers bypassing even the most seemingly robust security systems using the simplest methods. The author outlines the common misconceptions within the security industry that ultimately lead to insecure networks, including a company’s misallocation of its security budget and the controversies over which methods are most effective at fending off an attacker. Common attack vectors and misconfigurations that are devastating, but highly preventable, are also detailed.


  • Configuration Management with Windows PowerShell Desired State Configuration (DSC)

    STI Graduate Student Research
    by Brian E. Quick – August 18, 2015 

    Keeping information system baselines consistent with a formal configuration management plan can be a very difficult task. Changes to server-based systems and networking must be monitored in order to provide some measure of compliance. A new distributed configuration management platform from Microsoft called Desired State Configuration (DSC) makes this task easier. The objective of this paper is to describe in depth how PowerShell 4.0 can help to solve this common problem. DSC uses a declarative syntax that any skilled administrator can utilize to deploy software, monitor configuration drift, and even report conformance. DSC is cross-platform compatible, with hundreds of useful resources freely available. DSC leverages PowerShell 4.0 and gives administrators a useful way to automate configuration management.
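
    The core idea behind DSC’s declarative syntax is: state what the machine should look like, then have the engine report (or correct) any drift from that declaration. Real DSC configurations are PowerShell documents; the toy Python sketch below only mirrors the declare-then-compare logic, and the item names and states are invented:

```python
# Declare the desired state once; compliance checking becomes a pure
# comparison between this declaration and the observed state.
desired = {
    "TelnetClient": "Absent",
    "Web-Server":   "Present",
    "W32Time":      "Running",
}

def report_drift(desired: dict, actual: dict) -> dict:
    """Return {item: (desired, actual)} for every non-conformant item."""
    return {k: (v, actual.get(k, "Unknown"))
            for k, v in desired.items() if actual.get(k) != v}

# Observed state of a hypothetical host: one item has drifted.
actual = {"TelnetClient": "Present", "Web-Server": "Present",
          "W32Time": "Running"}
print(report_drift(desired, actual))
```

    The payoff of the declarative style is that the same declaration serves three purposes the abstract lists: deployment (apply the declaration), drift monitoring (the comparison above), and conformance reporting (the returned mapping).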


  • Leveraging the Federal Public Trust Clearance Model in State Government Personnel Security Programs

    by Joseph C. Impinna – July 17, 2015 

    Security clearances are a requirement when working with classified information at the federal level. In recent years, incidents involving unauthorized disclosures of highly sensitive classified information have brought the security clearance adjudication process under scrutiny. These incidents have reinforced the principle that a personnel security program that properly vets individuals is critical to any organization that wishes to protect its data. Although the effects of an incident at the state level may be narrower in scope than at the federal level, the need to safeguard sensitive information is the same. The national security clearance model is used at many state agencies that work with the Department of Defense and other federal entities. However, an agency that does not access national security data still has a responsibility to uphold public trust. For these organizations, the background check processes can vary greatly from state to state or even between agencies.
    An effective personnel security program is much more than simply granting access to protected information through a public trust clearance. To achieve the assurance implied with a clearance, other components must be included. While a direct implementation of the federal model may not be feasible, using just a few concepts to design a system tailored to the state level would significantly improve the security posture of the issuing agency.


  • Securing Single Points of Compromise (SPoC)

    by David Belangia – June 30, 2015 

    Securing the Single Points of Compromise that provide central services to the institution’s environment is paramount to success when trying to protect the business (Fisk, 2014). Time Based Security mandates protection (erecting and ensuring effective controls) that lasts longer than the time to detect and react to a compromise. When enterprise protections fail, providing additional layered controls for these central services buys more time to detect and react. While guidance is readily available for securing an individual critical asset, protecting these assets as a group is not often discussed. Using best business practices to protect these resources as individual assets, while leveraging holistic defenses for the group, increases the opportunity to maximize protection time, allowing detection and reaction time for the SPoCs that is commensurate with the inherent risk of these centralized services.



  • Integration of Network Conversation Metadata with Asset and Configuration Management Databases

    by William Yeatman – May 26, 2015 

    As an alternative to the loss of access to plaintext IP payloads in an increasingly encrypted and privacy-conscious world, network layer security analysis requires a shift of attention to examination and characterization of the packet and network conversation meta-information derived from packet header information. These characteristics can be incorporated into and treated as an integral part of asset and configuration management baselines. Changes detected in the expected endpoints, frequency, duration, and packet sizes can be flagged for review and subsequent response or adjustment to the baseline.
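
    The baseline-and-flag loop the abstract describes can be sketched in a few lines: expected conversations become part of the asset baseline, and observed flows are flagged either because the endpoint pair is new or because a header-derived characteristic (here, mean packet size) deviates too far. Addresses, the field layout, and the 50% tolerance are all illustrative assumptions:

```python
# Baseline of expected conversations, keyed on header-derived fields
# only (no payload inspection needed): (src, dst, dst_port) -> typical
# mean packet size in bytes.
baseline = {
    ("10.0.0.5", "10.0.0.9", 443): 900,
}

def flag_flows(flows, baseline, tolerance=0.5):
    """Flag flows absent from the baseline, or whose mean packet size
    deviates from the baseline value by more than `tolerance`."""
    flagged = []
    for (src, dst, port, mean_size) in flows:
        expected = baseline.get((src, dst, port))
        if expected is None:
            flagged.append((src, dst, port, "unknown conversation"))
        elif abs(mean_size - expected) / expected > tolerance:
            flagged.append((src, dst, port, "packet-size anomaly"))
    return flagged

observed = [
    ("10.0.0.5", "10.0.0.9", 443, 880),    # conforms to baseline
    ("10.0.0.5", "203.0.113.7", 443, 60),  # endpoint not in baseline
]
print(flag_flows(observed, baseline))
```

    A production version would extend the same comparison to the other characteristics named above: conversation frequency and duration.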


  • Practical El Jefe

    by Charles Vedaa – March 31, 2015 

    “El Jefe is a free situational awareness tool that can drastically reduce the costs for securing your enterprise by making locating and responding to advanced threats incredibly easy.” (Immunity Inc., n.d.).





  • The Best Defenses Against Zero-day Exploits for Various-sized Organizations

    by David Hammarberg – October 27, 2014 

    Zero-day exploits are vulnerabilities that have yet to be publicly disclosed. These exploits are usually the most difficult to defend against because data is generally only available for analysis after the attack has completed its course. These vulnerabilities are highly sought after by cyber criminals, governments, and software vendors who will pay high prices for access to the exploit (Bilge & Dumitras, 2012).




  • Are there novel ways to mitigate credential theft attacks in Windows?

    by James Foster – August 13, 2014 

    Once a single system is compromised by a determined attacker in a Windows environment, the attacker often tries to move laterally through the environment and escalate his privileges, potentially resulting in compromise of additional systems, up to the entire domain or forest.



  • Simulating Cyber Operations: A Cyber Security Training Framework

    by Bryan K. Fite – February 14, 2014 

    The current shortage (Finkle & Randewich, 2012) of trained and experienced Cyber Operations Specialists, coupled with the increasing threat (Sophos, 2013) posed by targeted attacks (Verizon, 2013), suggests that more effective training methods must be considered.






  • Corporate vs. Product Security

    by Philip Watson – May 22, 2013 

    When people hear “I deal with security” from any employee, the typical thought is that they are defending the enterprise, the web servers, the corporate email, and corporate secrets.


  • Information Risks & Risk Management

    by John Wurzler – May 1, 2013 

    In a relatively short period of time, data in the business world has moved from paper files, carbon copies, and filing cabinets to electronic files stored on very powerful computers.




  • Securing Blackboard Learn on Linux

    by David Lyon – December 1, 2011 

    Blackboard Learn (Bb Learn) is an application suite providing educational technology
    to facilitate online, web based learning. It is typical to see Bb Learn hosting courses and
    content. Common add-ons include the Community and Content systems which are
    licensed separately.


  • Secure Browsing Environment

    STI Graduate Student Research
    by Robert Sorensen – September 21, 2011 

    Today’s computing environment is fraught with treachery. It used to be that one could surf the web without any thought of infection or loss of private information. Those times have changed! One might argue that the safest connection to the web is no connection at all. However, this is not feasible in today’s socially networked world (Powell, 2011). The target for hackers has only grown.


  • Using GUPI to Create A Null Box

    STI Graduate Student Research
    by Robert Comella – September 15, 2010 

    When an administrator builds a Linux server, they make many decisions. One of the most difficult is deciding which packages to install. Linux distributions, upon installation, try to pass package selection off as an easy choice. The administrator must simply choose a function from the list and the installation program will automatically install all the necessary software to provide that service. The installation usually works and the resulting machine performs the desired task. Administrators focused only on functionality consider themselves finished and move on to the next task.


  • A Guide to Virtualization Hardening Guides

    Analyst Paper
    by Dave Shackleford – May 20, 2010 

    A guide to the virtualization hardening guides that includes key configuration and system security settings for VMware ESX and vSphere/Virtual Infrastructure with key control areas organizations need to consider.


  • Writing a Penetration Testing Report

    by Mansour Alharbi – April 29, 2010 

    A lot of currently available penetration testing resources lack report writing methodology and approach, which leads to a very big gap in the penetration testing cycle. A report, by definition, is a statement of the results of an investigation or of any matter on which definite information is required (Oxford English Dictionary).
    A penetration test is useless without something tangible to give to a client or executive officer. A report should detail the outcome of the test and, if you are making recommendations, document the recommendations to secure any high-risk systems (Whitaker & Newman, 2005). Report writing is a crucial part of any service engagement, especially for IT service and advisory providers. In pen testing, the final deliverable is a report that shows the services provided, the methodology adopted, the testing results, and recommendations. As one project manager at a major electronics firm said, “We don’t actually manufacture anything. Most of the time, the tangible products of this department [engineering] are reports.” There is an old saying in the consulting business: “If you do not document it, it did not happen” (Smith, LeBlanc & Lam, 2004).



  • Effective Use Case Modeling for Security Information & Event Management

    by Daniel Frye – March 10, 2010 

    With today’s technology there exist many methods to subvert an information system which could compromise the confidentiality, integrity, or availability of the resource. Due to the abstract nature of modern computing, the only way to be reliably alerted of a system compromise is by reviewing the system’s actions at both the host and network layers and then correlating those two layers to develop a thorough view into the system’s actions. In most instances, the computer user often has no indication of the existence of the malicious software and therefore cannot be relied upon to determine if their system is indeed compromised.
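
    The correlation this abstract calls for — pairing host-layer actions with network-layer activity to build a thorough view of a system — can be illustrated with a minimal time-window join. The event shapes, the 5-second window, and the sample events are assumptions for illustration, not a SIEM product’s actual data model:

```python
def correlate(host_events, net_events, window=5):
    """Pair host-layer and network-layer events observed on the same
    host whose timestamps (in seconds) fall within `window` of each
    other — the simplest form of cross-layer correlation."""
    pairs = []
    for h in host_events:
        for n in net_events:
            if h["host"] == n["host"] and abs(h["t"] - n["t"]) <= window:
                pairs.append((h["event"], n["event"]))
    return pairs

# Neither event alone is conclusive; together they form a use case
# worth alerting on.
host_events = [{"host": "ws01", "t": 100, "event": "new service installed"}]
net_events  = [{"host": "ws01", "t": 103, "event": "outbound IRC connection"},
               {"host": "ws02", "t": 103, "event": "DNS burst"}]
print(correlate(host_events, net_events))
```

    Real SIEM use cases express the same join declaratively as correlation rules, but the underlying logic — same asset, tight time window, events from two layers — is the one shown here.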


  • Building Servers as Appliances for Improved Security

    by Algis Kibirkstis – March 8, 2010 

    Defense-in-Depth is a term commonly used when describing a layered model for protecting computing environments; by having multiple layers of protection, from the perimeter of the network to each computing system at the core, security-related failures at any single layer should not compromise the confidentiality, integrity or availability of the overall system. In this day and age, simple reliance on firewalls for protection is generally considered imprudent (Brining, 2008), for they offer no network-level protection in case of failure, poor configuration, software misbehavior, or unauthorized access attempts posing as legitimate traffic; nor can they offer any protection when communications circumvent the firewall itself.


  • Preventing Incidents with a Hardened Web Browser

    by Chris Crowley – December 15, 2009 

    There is substantial industry documentation on web browser security because the web browser is currently a frequently used vector of attack. This paper investigates current literature discussing the threats present in today’s environment.



  • Building a Security Practice within a Mixed Product-R&D and Managed-Service Business

    by Evan Scheessele – July 27, 2007 

    Information-rich technology businesses offer their security staff more challenges today than ever. Where business is driven by active technology development and technology is delivered to customers in the form of a managed-service, security takes on a scope that impacts the business’s fundamentals. This paper addresses the challenges and best practices related to delivering overall security (here referred to as a security practice) within a complex business. The template business examined in this paper hosts both highly complex networked-product R&D and 24/7 outsourced managed services.



  • Sudo for Windows (sudowin)

    by Andrew Kutz – February 14, 2007 

    The original Sudo application was designed by Bob Coggeshall and Cliff Spencer in 1980 within the halls of the Department of Computer Science at SUNY/Buffalo. Sudo encourages the principle of least privilege: that is, a user operates with a bare minimum number of privileges on a system until the user requests a higher level of privilege in order to accomplish some task.


  • Midrange & Mainframe systems for Security Policies compliance control Tool

    by Pierre Cailloux – February 12, 2005 

    The goal of this document, within the scope of the practical exam for the SANS GSEC certification (option 2), is to present a solution for a company to manage and apply computing security rules on mainframe and midrange systems, as well as on facilities management systems complying with other security rules specific to customers.


  • Network Security and the SMB

    by Matthew Hawley – January 28, 2005 

    Network security is an issue for all businesses. The challenges faced by small-to-medium size businesses (SMBs) are unique and significant.


  • Internal Security in an Engineering Development Environment

    by Art Homs – January 17, 2005 

    Organizations that design, develop, test, and support IP based products present unique security challenges in a converged services network. In an ideal scenario, engineering labs where these activities take place are insulated from the corporate environment to prevent interactions that can compromise corporate network confidentiality, integrity, and availability.


  • Host Assessment and Risk Rating

    by Radhika Vedaraman – August 28, 2004 

    Corporate websites get defaced; business activities of organizations get crippled; identities are stolen; confidential information is made public – all because of failing to secure information and resources and to take the precautions necessary to protect against attacks.


  • Patch Management and the Need for Metrics

    by Ken MacLeod – August 28, 2004 

    The principal objective of ‘Patch Management and the Need for Metrics’ is to demonstrate that organisations cannot meaningfully assess their security posture, with reference to their patch status, without the use of appropriate metrics.
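
    What such a metric might look like in practice can be shown with a few lines of arithmetic: a compliance percentage and a mean age of outstanding patches give management two comparable, trendable numbers. The host data below is invented for illustration:

```python
# Per-host list of how many days each missing patch has been
# outstanding (an empty list means the host is fully patched).
hosts = {
    "web01": [],
    "db01":  [12, 40],
    "ws17":  [3],
}

# Metric 1: percentage of hosts with no missing patches.
fully_patched = sum(1 for missing in hosts.values() if not missing)
compliance_pct = 100.0 * fully_patched / len(hosts)

# Metric 2: mean age in days of all outstanding patches.
all_ages = [d for missing in hosts.values() for d in missing]
mean_age = sum(all_ages) / len(all_ages)

print(round(compliance_pct, 1), round(mean_age, 1))
```

    Tracked over time, movement in these two numbers gives the meaningful assessment of patch posture that the paper argues raw scan output cannot.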




  • Beyond Patch Management

    by Dan Shauver – July 25, 2004 

    Systems maintenance, including operating system and software upgrades and patch management, has long been a major factor in security-related incidents. Application upgrades and patches can be equally necessary to system integrity, yet are equally likely to be ignored.



  • Printing the Paper and Serving the News after a Localized Disaster

    by John Soltys – June 9, 2004 

    A case study detailing the implementation of a business continuity plan for a regional newspaper. This study covers the requirements-gathering process, testing, and implementation of a series of plans jointly developed by members of the newsroom, IT, online staff, and operations.


  • The Art of Web Filtering

    by Robert Alvey – April 8, 2004 

    Web filters are designed to improve the security and productivity of a network, but as with anything else, they must be implemented correctly to work properly. To ensure a web filter is implemented successfully, several factors need to be considered.



  • Securing the Network in a K-12 Public School Environment

    by Russell Penner – December 21, 2003 

    This paper addresses the K-12 public education data network environment which presents special needs and requirements, including privacy (confidentiality), data integrity, and content filtering.





  • Implementing Least Privilege at your Enterprise

    by Jeff Langford – September 4, 2003 

    This paper provides background on enterprise security, offers some rationale to help develop support for its acceptance, and identifies ways it can be implemented within your enterprise.


  • Federal Information Technology Management and Security

    by John Hopkins – September 4, 2003 

    This paper examines the long-standing vision of one senior OMB manager to reinforce a seven-year-old plan he helped draft that uses the federal IT budget planning process to accomplish three principal objectives.


  • Open Source Risk Mitigation Process

    by Carlos Casanova – August 22, 2003 

    The Open Source Risk Mitigation Process described in this paper is a tool for corporations to use when trying to understand why a seemingly simple decision to use “free” Open Source software should be taken very seriously.




  • A Guide to Government Security Mandates

    by Christian Enloe – May 8, 2003 

    This document endeavors to provide the reader with a solid understanding of the certification process, the order in which the steps should be completed, and some lessons learned from actual experience.




  • Securing an Application: A Paper on Plastic

    by Joe Rhode – February 28, 2003 

    This paper discusses the process of integrating a credit card application with the front end of existing accounting and payment processing applications, the information risk analysis process needed, and the action plan to implement the mitigating controls.


  • Designing a Secure Local Area Network

    by Daniel Oxenhandler – January 30, 2003 

    This paper examines some of the issues in designing a secure Local Area Network (LAN) and some of the best practices suggested by security experts.


  • OpenVMS 7.2 Security Essentials

    by Jeff Leving – December 23, 2002 

    This paper attempts to build on the foundational article submitted by Steven Bourdon in March 2002 (Bourdon), by providing a security-focused overview of the basic tasks performed when installing a standalone OpenVMS server.


  • Securing Your RILOE Cards

    by Rick McCarter – November 27, 2002 

    This paper outlines the components of the RILOE, detailed features and functionality of the card, pre installation tips, physical installation instructions, physical setup instructions, and initial setup configuration parameters.


  • Secure Computing – An Elementary Issue

    by Susan Briere – October 28, 2002 

    This paper was developed as a resource for elementary school technical support personnel responsible for maintaining a safe and secure computing environment.


  • Securing Our Critical Infrastructures

    by Chris Brooks – October 11, 2002 

    In the event of a successful attack, limiting the amount of damage and quickly redistributing the assets to maintain a minimum essential infrastructure is critical in keeping the defense and national economy functioning.


  • Implementing an Effective IT Security Program

    by Kurt Garbars – August 28, 2002 

    The purpose of this paper is to take the wide variety of federal government laws, regulations, and guidance, combined with industry best practices, and define the essential elements of an effective IT security program.


  • A Survival Guide for Security Professionals

    by Conrad Morgan – March 20, 2002 

    This survival guide aims to assist security professionals to balance the responsibilities and requirements of their role to avoid stress and burnout.


  • Who Wants To Be A Weakest Link?

    by Russell Hany – March 7, 2002 

    This paper emphasizes the need to convey good security practices throughout an organization, because the “weakest link” can be located anywhere along a company’s “chain.”





  • Pre-Development Security Planning

    by Keith Marohn – August 13, 2001 

    This document will outline the basic steps that should be completed before code development begins to ensure delivery of a successful project.


Most of the computer security white papers in the Reading Room have been written by students seeking GIAC certification to fulfill part of their certification requirements and are provided by SANS as a resource to benefit the security community at large. SANS attempts to ensure the accuracy of information, but papers are published “as is”. Errors or inconsistencies may exist or may be introduced over time as material becomes dated. If you suspect a serious error, please contact webmaster@sans.org.

All papers are copyrighted. No re-posting or distribution of papers is permitted.

Source: SANS ISC SecNewsFeed @ March 1, 2017 at 01:30PM
