How Do I Intend To Accomplish A Mission?

How do you intend to help accomplish our sorority’s mission of: Honoring our Past, Uniting our Present and Mentoring our Future?

One full page essay needed.


Impact of Assigned Article Content on Future Practice

Running head: SCHOLARLY PAPER PHASE 1 1

Scholarly Paper Phase 1

Your Name (without credentials)

Chamberlain College of Nursing

NR351: Transitions in Professional Nursing

[insert session month and year here]

NOTE: No abstract

NOTE: This is a required template and guide and must be used for this assignment.

Delete all yellow highlighted words.

Scholarly Paper Phase 1 (paper title, begins on page 2)

(No heading of Introduction) This section is to remain blank for this assignment. Do NOT type in this section for this assignment.

Assigned Article Summary

Type statements that summarize the assigned article here. This section should include a summary of the most important ideas in the assigned article. Most of these facts should be paraphrased (with proper citations), and one or two short direct quotations (with appropriate citations) should be used in this section. Do not include prior knowledge, experience, or opinion in this section. All facts, both quoted and paraphrased, must originate from and be cited to the assigned article; no information should be included from other sources. See the rubric for length limitations and other criteria.

Add paragraphs here as needed.

Impact of Assigned Article Content on Future Practice

Type statements here about the impact that the content of the assigned article will have on your future professional nursing practice. This portion of the paper should present your own ideas about how your future practice will be affected by the content of the assigned article. Because the ideas are your own, use of first person is appropriate and no citations are needed in this section. See the rubric for length limitations and other criteria.

Add paragraphs here as needed.

Conclusion

This section is to remain blank for this assignment. Do NOT type in this section for this assignment.

References (centered, not bold)

Type your reference here using hanging indent and double line spacing (under “Paragraph” on the Home toolbar ribbon). See your APA Manual and the resources in the APA section of Course Resources under Modules for reference formatting.


SCHOLARLY TASK PHASE 1

PREPARING THE SCHOLARLY PAPER PHASE 1

Carefully read these instructions and the Rubric.

Download the Week 4 Scholarly Paper Phase 1 Template. Use of the assigned template is required. Rename that document Your Last Name Scholarly Paper Phase 1.docx (for example, Smith Scholarly Paper Phase 1.docx). Save it to your own computer or drive in a location where you will be able to retrieve it later.

Type your assignment directly on the saved template. You are required to complete the form using the productivity tools required by Chamberlain University: Microsoft Word 2013 (or a later version) for Windows, or Office 2011 (or a later version) for Mac. You must save the file in the .docx format; do NOT save it as WordPad. Office 365, which includes a later version of Word, is available to Chamberlain students for FREE by downloading it from the student portal at http://my.chamberlain.edu (click on the envelope at the top of the page). Save your work frequently as you type to prevent loss of your work.

The only resource for your paper is the following assigned article: Article link

Note: Logging in to the Chamberlain Library is needed to access this article. Use of the assigned article is required. You must click on the PDF Full Text link on the upper left portion of the page to download the correct version of this required article.

Follow the instructions and specifics on the assigned required template and the rubric. You will demonstrate your scholarly writing abilities as well as APA abilities in references, citations, quotations, and paraphrasing.

See rubric for length limitations for each section and other criteria.

Information below explains how to complete the Article Summary section of the paper (see Rubric for details).

Clearly summarize the major content of the assigned article using 175–200 words.

Content must include main ideas from across the entire article.

Specific details should be accurate and clearly presented.

Content must be attributed to the correct source.

For the Impact section (see rubric for details)

clearly state how learning from the assigned article will impact your future practice;

length must be 125–150 words;

writing must be concise and clearly relate the assigned article contents to practice; and

use first person in this section.

Double check your work with the rubric prior to submission.
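Because the rubric enforces strict word-count ranges (175–200 words for the summary and 125–150 words for the impact section), a quick check of each section before submission can prevent avoidable point losses. The sketch below is purely illustrative; the function names are my own, not part of the assignment.

```python
def word_count(text):
    """Count words the way a word processor does: whitespace-separated tokens."""
    return len(text.split())

def within_range(text, low, high):
    """Return True if the text's word count falls inside the rubric's range."""
    return low <= word_count(text) <= high

# Rubric limits from the instructions above.
SUMMARY_RANGE = (175, 200)   # Assigned Article Summary
IMPACT_RANGE = (125, 150)    # Impact on Future Practice

draft = "word " * 180        # stand-in for your summary paragraph
print(within_range(draft, *SUMMARY_RANGE))   # → True
```

Paste each section of your draft into a check like this before comparing against the rubric; note that word processors may count hyphenated terms slightly differently.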

Note: Assigned Template must be used for this assignment. The Assigned Template has been specially prepared to help you do well on this assignment. See #2 above.

Note: Assigned Article must be used for this assignment. Failure to do so may result in loss of points and/or Academic Integrity violation investigation.


Grasping Evolving Technologies

As we all know, technology is evolving at a rate that, to some, seems overwhelming. These technologies often evolve to offer higher-quality products and services at lower prices, causing a disruption in markets that is sometimes perceived as unwelcome. These disruptive technologies are sometimes the result of innovative business models that are themselves part of the evolving processes of a competitive marketplace.

This is an individual research paper required from BA643 students.

As a Research Project, select one of the following research areas: Cloud Computing (Intranet, Extranet, and Internet), Machine Learning, Artificial Intelligence, Internet of Things (IoT), Robotics, or Medical Technology.

1) The research paper must include only materials from peer-reviewed journals and peer-reviewed conference proceedings. APA-formatted citations are therefore required for the final submission. Newspapers, websites (URLs), magazines, technical journals, hearsay, personal opinions, and white papers are NOT acceptable citations.

2) Each submission will be checked for plagiarism. Any plagiarized document will result in a grade of zero for the exercise.

3) If the document contains extensive synonym use or long, incomprehensible sentences, it will result in a grade of zero for the exercise.

4) The final research paper must include your thorough analysis and synthesis of the peer-reviewed literature used in your research paper.

5) All images, tables, and figures are to be included in the appendices and do NOT count toward the page-limit requirements.

6) Long quotations (i.e., whole paragraphs) are NOT permitted. Only one quoted sentence is permitted per page.

7) Footnotes are NOT permitted.

Document Details
Chapter 1 Introduction
Background/Introduction
In this section, present enough information about the proposed work such that the reader understands the general context or setting. It is also helpful to include a summary of how the rest of this document is organized.

Problem Statement
In this section, present a concise statement of a research-worthy problem addressed (i.e., why the work should be undertaken; do not say it is required for the class). Follow the statement of the problem with a well-supported discussion of its scope and nature. The discussion of the problem should include: what the problem is, why it is a problem, how the problem evolved or developed, and the issues and events leading to the problem.

Goal
Next, include a concise definition of the goal of the work (i.e., what the work will accomplish). Aim to define a goal that is measurable.

Research Questions
Research questions are developed to help guide the authors through the literature for a given problem area. What open-ended questions were asked, and why did you find (or not find) them adequate?

Relevance and Significance
Consider the following questions as you read through the article, and state how the author(s) supported, or left unsupported, the relevance and significance of their research literature:

· Why is there a problem? What groups or individuals are affected?

· How far-ranging is the problem and how great is its impact? What’s the benefit of solving the problem?

· What has been tried without success to correct the situation? Why weren’t those attempts successful? What are the consequences of not solving the problem?

· How does the goal of your study address the research problem and how will your proposed study offer promise as a resolution to the problem?

· How will your research add to the knowledge base?

· What is the potential for generalization of your results?

· What is the potential for original work?

Barriers and Issues
In these paragraphs, identify how the problem is inherently difficult to solve. How did the solution the author(s) propose address the difficulties?

Chapter 2 Literature Review
In this section, it is important to clearly identify the major areas on which you will need to focus your research in order to build a solid foundation for your study in the existing body of knowledge. The literature review is the presentation of quality literature in a particular field that serves as the foundation and justification for the research problem, research questions or hypothesis, and methodology. You will develop a more comprehensive review of the literature as part of your research.

Chapter 3 Approach/Methodology
List the major steps taken to accomplish the goal of your study. Include a preliminary discussion of the methodology and specific research methods you plan to implement.

Chapter 4: Findings, Analysis, and Summary of Results
Include an objective description and analysis of the findings, results or outcomes of the research. Limit the use of charts, tables, figures to those that are needed to support the narrative. Most of these illustrations can be included as part of the Appendix.

  1. The following topics are intended to serve as a guide:

a. Data analysis

b. Findings & discussion

c. Analysis

d. Summary of results & discussion

Chapter 5: Conclusions
· Conclusions – Clearly state the conclusions of the study based on the analysis performed and results achieved. Indicate by the evidence or logical development the extent to which the specified objectives have been accomplished. If the research has been guided by hypotheses, make a statement as to whether the data supported or rejected these hypotheses. Discuss alternative explanations for the findings, if appropriate. Delineate strengths, weaknesses, and limitations of the study.

· Implications – Discuss the impact of the work on the field of study and its contributions to knowledge and professional practice. Discuss implications for future research.

· Recommendations – Present recommendations for future research or for changes in research methods or theoretical concepts. As appropriate, present recommendations for changes in academic practice, professional practice, or organizational procedures, practices, and behavior.

References
Follow the most current version of APA to format your references. However, each reference should be single-spaced with a double space in between each entry.

Formatting Details
Margins

The left-hand margin must be 1.5 inches (4 cm). Margins at the right, top, and bottom of the page should be 1.0 inch. (See the exception for chapter title pages below.) The Research Report text may be left-aligned (leaving a ragged right edge) or may be both left- and right-aligned (justified).

Line Spacing

Double-spacing is required for most of the text in documents submitted during the Research Report process.

Paragraph Spacing

The text of the document is double-spaced. There should be no extra spaces between paragraphs in sections; however, indent the first line of each paragraph five spaces.

Page Numbering

All pages should have page numbers in Arabic numerals in the upper right-hand corner.

Type Style

For body text, you should use 12-point Times New Roman. Text for the cover page may be larger but should not exceed 14-point size. Text for the chapter title text should be 14-point size. Be consistent in your use of typefaces throughout the document. Do not use a compressed typeface or any settings on your word processor that would decrease the spacing between letters or words. Sans serif typefaces such as Helvetica or Arial may be used for relatively short blocks of text such as chapter headings and captions but should be avoided in long passages of text as they impede readability.

Title Page

Every document that is submitted must have a title page. The title page includes the exact title of the research report, date of submission, your team name, and the name of each team member.

Chapter Title Heading, Subheadings, and Sub-Subheadings

Submitted Research Reports must use no more than three levels of headings in the body text. All headings should capitalize only the first letter of each word, except that minor words shorter than four letters are not capitalized.

Instructions for heading levels follow:

Level 1: Chapter Title Heading

This heading starts two inches from the top of the page, is centered on the page, and is set in 14-point type. The first line contains the chapter number (e.g., Chapter 4). The second line is blank. The third line displays the chapter title, is centered on the page, and is set in 14-point type.

Level 2: Subheading

Start the subheading at the left margin of the page, four spaces (i.e., two returns when your document is set for double-spacing) down from the title, set in bold 12-point type. Double-space (one return) to the subheading body text. Indent the first line of the body text five spaces.

Level 3: Sub-Subheading

Start the sub-subheading at the left margin of the page, double-spaced (i.e., one return when your document is set up for double-spacing) from the subheading, set in 12-point italics. Double-space (one return) to the sub-subheading body text. Indent the first line of the body text five spaces.
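The three heading levels above reduce to a small formatting table: level 1 is 14-point and centered, level 2 is 12-point bold at the left margin, and level 3 is 12-point italic at the left margin. The sketch below simply encodes those rules as a lookup table for self-checking a draft; the names and structure are my own illustration, not part of the assignment.

```python
# Heading rules from the instructions above, encoded as a lookup table.
# Level 1: chapter title heading, centered, 14-point.
# Level 2: subheading, left margin, bold, 12-point.
# Level 3: sub-subheading, left margin, italic, 12-point.
HEADING_RULES = {
    1: {"size_pt": 14, "align": "center", "bold": False, "italic": False},
    2: {"size_pt": 12, "align": "left",   "bold": True,  "italic": False},
    3: {"size_pt": 12, "align": "left",   "bold": False, "italic": True},
}

def check_heading(level, size_pt, align, bold=False, italic=False):
    """Return True if a heading's formatting matches the rule for its level."""
    rule = HEADING_RULES[level]
    return (rule["size_pt"] == size_pt and rule["align"] == align
            and rule["bold"] == bold and rule["italic"] == italic)

print(check_heading(2, 12, "left", bold=True))   # → True
```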


Current & Emerging Technology

Research assignment

10 pages 7 hours no plagiarism

BA634 Current & Emerging Technology

Research Paper

Understanding Evolving Technologies


Report to Congressional Committees

TRANSPORTATION WORKER IDENTIFICATION CREDENTIAL

Card Reader Pilot Results Are Unreliable; Security Benefits Need to Be Reassessed

Report to Congressional Committees

May 2013

GAO-13-198

United States Government Accountability Office

Highlights of GAO-13-198, a report to congressional committees

May 2013

TRANSPORTATION WORKER IDENTIFICATION CREDENTIAL Card Reader Pilot Results Are Unreliable; Security Benefits Need to Be Reassessed

Why GAO Did This Study

Within DHS, TSA and USCG manage the TWIC program, which requires maritime workers to complete background checks and obtain biometric identification cards to gain unescorted access to secure areas of Maritime Transportation Security Act (MTSA)-regulated entities. TSA conducted a pilot program to test the use of TWICs with biometric card readers in part to inform the development of a regulation on using TWICs with card readers. As required by law, DHS reported its findings on the pilot to Congress on February 27, 2012. The Coast Guard Authorization Act of 2010 required that GAO assess DHS’s reported findings and recommendations. Thus, GAO assessed the extent to which the results from the TWIC pilot were sufficiently complete, accurate, and reliable for informing Congress and the proposed TWIC card reader rule. GAO reviewed pilot test plans, results, and methods used to collect and analyze pilot data since August 2008, compared the pilot data with the pilot report DHS submitted to Congress, and conducted covert tests at four U.S. ports chosen for their geographic locations. The test’s results are not generalizable, but provide insights.

What GAO Recommends

Congress should halt DHS’s efforts to promulgate a final regulation until the successful completion of a security assessment of the effectiveness of using TWIC. In addition, GAO revised the report based on the March 22, 2013, issuance of the TWIC card reader notice of proposed rulemaking.

What GAO Found

GAO’s review of the pilot test aimed at assessing the technology and operational impact of using the Transportation Security Administration’s (TSA) Transportation Worker Identification Credential (TWIC) with card readers showed that the test’s results were incomplete, inaccurate, and unreliable for informing Congress and for developing a regulation (rule) about the readers. Challenges related to pilot planning, data collection, and reporting affected the completeness, accuracy, and reliability of the results. These issues call into question the program’s premise and effectiveness in enhancing security.

Planning. The Department of Homeland Security (DHS) did not correct planning shortfalls that GAO identified in November 2009. GAO determined that these weaknesses presented a challenge in ensuring that the pilot would yield information needed to inform Congress and the regulation aimed at defining how TWICs are to be used with biometric card readers (card reader rule). GAO recommended that DHS components implementing the pilot—TSA and the U.S. Coast Guard (USCG)—develop an evaluation plan to guide the remainder of the pilot and identify how it would compensate for areas where the TWIC reader pilot would not provide the information needed. DHS agreed and took initial steps, but did not develop an evaluation plan, as GAO recommended.

Data collection. Pilot data collection and reporting weaknesses include:

• Installed TWIC readers and access control systems could not collect required data on TWIC reader use, including reasons for errors, and TSA and the independent test agent (responsible for planning, evaluating, and reporting on all test events) did not employ effective compensating data collection measures, such as manually recording reasons for errors in reading TWICs.

• TSA and the independent test agent did not record clear baseline data for comparing operational performance at access points with TWIC readers.

• TSA and the independent test agent did not collect complete data on malfunctioning TWIC cards.

• Pilot participants did not document instances of denied access.

TSA officials said challenges, such as readers incapable of recording needed data, prevented them from collecting complete and consistent pilot data. Thus, TSA could not determine whether operational problems encountered at pilot sites were due to TWIC cards, readers, or users, or a combination of all three.

Issues with DHS’s report to Congress and validity of TWIC security premise. DHS’s report to Congress documented findings and lessons learned, but its reported findings were not always supported by the pilot data, or were based on incomplete or unreliable data, thus limiting the report’s usefulness in informing Congress about the results of the TWIC reader pilot. For example, reported entry times into facilities were not based on data collected at pilot sites as intended. Further, the report concluded that TWIC cards and readers provide a critical layer of port security, but data were not collected to support this conclusion. For example, DHS’s assumption that the lack of a common credential could leave facilities open to a security breach with falsified credentials has not been validated. Eleven years after initiation, DHS has not demonstrated how, if at all, TWIC will improve maritime security.

View GAO-13-198. For more information, contact Stephen M. Lord at (202) 512-4379 or lords@gao.gov.

Page i GAO-13-198 TWIC Reader Pilot Review

Contents

Letter 1

Background 6

TWIC Reader Pilot Results Are Not Sufficiently Complete, Accurate, and Reliable for Informing Congress and the TWIC Card Reader Rule 13

Conclusions 42

Matter for Congressional Consideration 43

Agency Comments and Our Evaluation 43

Appendix I Objective, Scope, and Methodology 48

Appendix II Key TWIC Implementation Actions 58

Appendix III TWIC Program Funding 59

Appendix IV TWIC Reader Pilot Sites, Locations, and Types of Maritime Operation or Industry Group 61

Appendix V Comments from the Department of Homeland Security 62

Appendix VI GAO Contact and Staff Acknowledgments 67

Tables

Table 1: Three Assessments Planned for the Transportation Worker Identification Credential (TWIC) Reader Pilot 11

Table 2: Weaknesses in the Transportation Worker Identification Credential (TWIC) Reader Pilot Affecting the Completeness, Accuracy, and Reliability of Data Collected 19

Table 3: Key Transportation Worker Identification Credential (TWIC) Program Laws and Implementation Actions from November 2002 through November 2012 58

Table 4: Transportation Worker Identification Credential (TWIC) Program Funding from Fiscal Years 2002 through 2012 59

Abbreviations

ATSA  Aviation and Transportation Security Act
DHS  Department of Homeland Security
DOD  Department of Defense
EOA  early operational assessment
FEMA  Federal Emergency Management Agency
FIPS  Federal Information Processing Standard
ICE  initial capability evaluation
ITT  initial technical testing
MSRAM  Maritime Security Risk Analysis Model
MTSA  Maritime Transportation Security Act of 2002
NAVAIR  Naval Air Systems Command
NPRM  Notice of Proposed Rulemaking
TSA  Transportation Security Administration
TWIC  Transportation Worker Identification Credential
USCG  U.S. Coast Guard
SAFE Port Act  Security and Accountability For Every Port Act of 2006
SOVT  Systems Operational Verification Testing
SPAWAR  Space and Naval Warfare Systems Command
ST&E  system test and evaluation
TEMP  Test and Evaluation Master Plan

This is a work of the U.S. government and is not subject to copyright protection in the United States. The published product may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.


441 G St. N.W. Washington, DC 20548

May 8, 2013

Congressional Committees

Ports, waterways, and vessels handle billions of dollars in cargo annually, and an attack on our nation’s maritime transportation system could have serious consequences. Maritime workers, including longshoremen, mechanics, truck drivers, and merchant mariners, access secure areas of the nation’s estimated 16,400 maritime-related transportation facilities and vessels, such as cargo container and cruise ship terminals, each day while performing their jobs.1 Securing transportation systems and maritime-related facilities requires balancing security to address potential threats while facilitating the flow of people and goods that are critical to the U.S. economy and necessary for supporting international commerce. As we have previously reported, these systems and facilities are vulnerable and difficult to secure given their size, easy accessibility, large number of potential targets, and proximity to urban areas.2

The Department of Homeland Security’s (DHS) Transportation Worker Identification Credential (TWIC) program was initiated in December 2001 in response to the September 11, 2001, terrorist attacks. The TWIC program is intended to provide a tamper-resistant biometric credential3 to maritime workers who require unescorted access to secure areas of facilities and vessels regulated under the Maritime Transportation Security Act of 2002 (MTSA).4 TWIC is to enhance the ability of MTSA-regulated facility and vessel owners and operators to control access to their facilities and verify workers’ identities. Under current statute and regulation, maritime workers requiring unescorted access to secure areas of MTSA-regulated facilities or vessels are required to obtain a TWIC,5 and facility and vessel operators are required by regulation to visually inspect each worker’s TWIC before granting unescorted access.6

1For the purposes of this report, the term “maritime-related transportation facilities” refers to seaports, inland ports, offshore facilities, and facilities located on the grounds of ports.

2See, for example, GAO, Transportation Worker Identification Credential: Internal Control Weaknesses Need to Be Corrected to Help Achieve Security Objectives, GAO-11-657 (Washington, D.C.: May 10, 2011); Transportation Worker Identification Credential: Progress Made in Enrolling Workers and Activating Credentials but Evaluation Plan Needed to Help Inform the Implementation of Card Readers, GAO-10-43 (Washington, D.C.: Nov. 18, 2009); and Port Security: Better Planning Needed to Develop and Operate Maritime Worker Identification Card Program, GAO-05-106 (Washington, D.C.: Dec. 10, 2004).

3A biometric access control system consists of technology that determines an individual’s identity by detecting and matching unique physical or behavioral characteristics, such as fingerprint or voice patterns, as a means of verifying personal identity.

Within DHS, the Transportation Security Administration (TSA) and the U.S. Coast Guard (USCG) jointly administer the TWIC program. TSA is responsible for enrollment, security threat assessments, and TWIC enrollee data systems operations and maintenance. USCG is responsible for the enforcement of regulations governing the use of TWICs at MTSA-regulated facilities and vessels. In addition, DHS’s Screening Coordination Office facilitates coordination among the various DHS components involved in TWIC, such as TSA and USCG.7

Prior to being granted a TWIC, maritime workers are required to undergo a background check, known as a security threat assessment. As of November 2012, TSA operates approximately 135 centers where workers can enroll in the program and pick up their TWIC cards. These centers are located in ports and in areas where there are concentrations of maritime activity throughout the United States and its territories. As of April 11, 2013, TSA has issued nearly 2.3 million TWICs.

4Pub. L. No. 107-295, 116 Stat. 2064. According to Coast Guard regulations, a secure area is an area that has security measures in place for access control. 33 C.F.R. § 101.105. For most maritime facilities, the secure area is generally any place inside the outermost access control point. For a vessel or outer continental shelf facility, such as offshore petroleum or gas production facilities, the secure area is generally the whole vessel or facility. A restricted area is a part of a secure area that needs more limited access and higher security. Under Coast Guard regulations, an owner/operator must designate certain specified types of areas as restricted. For example, storage areas for cargo are restricted areas under Coast Guard regulations. 33 C.F.R. § 105.260(b)(7).

546 U.S.C. § 70105(a); 33 C.F.R. § 101.514.

633 C.F.R. §§ 104.265(c), 105.255(c).

7DHS’s Screening Coordination Office was established in 2006 to coordinate and harmonize the numerous and disparate credentialing and screening initiatives within DHS.


We have been reporting on TWIC progress and challenges since September 2003.8 Among other issues, we highlighted steps that TSA and USCG were taking to meet an expected surge in initial enrollment, as well as various challenges experienced in the TWIC testing conducted by a contractor for TSA and USCG from August 2004 through June 2005. We also identified challenges related to ensuring that the TWIC technology works effectively in the harsh maritime environment.9 In November 2009, we reported on the design and approach of a pilot initiated in August 2008 to test TWIC readers, and found that DHS did not have a sound evaluation methodology to ensure information collected through the TWIC reader pilot would be complete and accurate.10 Moreover, in May 2011, we reported that internal control weaknesses governing the enrollment, background checking, and use of TWIC potentially limit the program’s ability to provide reasonable assurance that access to secure areas of MTSA-regulated facilities is restricted to qualified individuals.11 Additional information on our past work and related recommendations is discussed later in this report.

USCG is leading efforts to develop a new TWIC regulation (rule) regarding the use of TWIC cards with readers (known as the TWIC card reader rule). The TWIC card reader rule is expected to define whether and under what circumstances facility and vessel owners and operators are to use electronic card readers to verify that a TWIC card is valid. To help inform this rulemaking and to fulfill the Security and Accountability For Every Port Act of 2006 (SAFE Port Act) requirement, TSA conducted a TWIC reader pilot from August 2008 through May 2011 to test a variety of biometric readers, as well as the credential authentication and validation process.12

The TWIC reader pilot, implemented with the voluntary participation of maritime port, facility, and vessel operators, was to test the technology, business processes, and operational impacts of deploying card readers at maritime facilities and vessels prior to issuing a final rule.13 Among other things, the SAFE Port Act required that DHS submit a report on the findings of the pilot program to Congress.14 DHS submitted its report to Congress on the findings of the TWIC reader pilot on February 27, 2012.15

8GAO, Maritime Security: Progress Made in Implementing Maritime Transportation Security Act, but Concerns Remain, GAO-03-1155T (Washington, D.C.: Sept. 9, 2003).

9GAO, Transportation Security: DHS Should Address Key Challenges before Implementing the Transportation Worker Identification Credential Program, GAO-06-982 (Washington, D.C.: Sept. 29, 2006). TWIC readers and related technologies operated outdoors in the harsh maritime environment can be affected by dirt, salt, wind, and rain.

10GAO-10-43.

11GAO-11-657.

12Pub. L. No. 109-347, § 104(a), 120 Stat. 1884, 1888 (codified at 46 U.S.C. § 70105(k)).

The Coast Guard Authorization Act of 2010 required that the TWIC reader pilot report include, among other things, a comprehensive listing of the extent to which established metrics were achieved during the pilot program, and that GAO conduct an assessment of the report’s findings and recommendations.16 To meet this requirement, we addressed the following question:

• To what extent were the results from the TWIC reader pilot sufficiently complete, accurate, and reliable for informing Congress and the TWIC card reader rule?

To conduct our work, we assessed TWIC reader pilot test plans and results, as well as DHS’s February 2012 report to Congress on the results of the TWIC reader pilot. We reviewed the extent to which pilot test plans were updated and used since we reported on them in November 2009.17 We also assessed the methods used to collect and analyze pilot data since the inception of the pilot in August 2008. We analyzed and compared the pilot data with the TWIC reader pilot report submitted to Congress to determine whether the findings in the report are based on sufficiently complete, accurate, and reliable data. In doing so, we reviewed TWIC reader pilot site reports from all of the sites and the underlying data to assess the extent to which data in these reports were consistent and complete. Additionally, we interviewed officials at DHS, TSA, and USCG with responsibilities for overseeing the TWIC program, as well as pilot officials responsible for coordinating pilot efforts with TSA and the independent test agent, about TWIC reader pilot testing approaches, results, and challenges. We compared the TWIC reader pilot effort with requirements in MTSA, the SAFE Port Act, and the Coast Guard Authorization Act of 2010. We further assessed the effort, including data collection and reporting, against established practices for designing evaluations and assessing the reliability of computer-processed data, as well as internal control standards for collecting and maintaining records.18 Our investigators also conducted limited covert testing of TWIC program internal controls for acquiring and using TWIC cards at four maritime ports to update our understanding of the effectiveness of TWIC at enhancing maritime security since we reported on these issues in May 2011.19 The information we obtained from the four maritime ports is not generalizable across the maritime transportation industry as a whole, but provided additional perspectives and context on the TWIC program. Finally, we reviewed and assessed the security benefits presented in the TWIC reader notice of proposed rulemaking (NPRM) issued March 22, 2013, to determine whether the effectiveness of the noted security benefits was presented.20

13The SAFE Port Act required the Secretary to conduct a pilot program to test the business processes, technology, and operational impacts required to deploy transportation security card readers at secure areas of the maritime transportation system. 46 U.S.C. § 70105(k)(1)(A).

1446 U.S.C. § 70105(k)(4).

15Department of Homeland Security, Transportation Worker Identification Credential Reader Pilot Program: In accordance with Section 104 of the Security and Accountability For Every Port Act of 2006, P.L. 109-347 (SAFE Port Act) Final Report. Feb. 17, 2012.

16Pub. L. No. 111-281, § 802, 124 Stat. 2905, 2989. Specifically, the report was to include (1) the findings of the pilot program with respect to key technical and operational aspects of implementing TWIC technologies in the maritime sector; (2) a comprehensive listing of the extent to which established metrics were achieved during the pilot program; and (3) an analysis of the viability of those technologies for use in the maritime environment, including any challenges to implementing those technologies and strategies for mitigating identified challenges.

17GAO-10-43.

For additional details on our scope and methodology, see appendix I.

We conducted this performance audit from January 2012 to May 2013 in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objective. We conducted our related investigative work in accordance with standards prescribed by the Council of the Inspectors General on Integrity and Efficiency.

18GAO, Designing Evaluations: 2012 Revision, GAO-12-208G (Washington, D.C.: Jan. 31, 2012); Assessing the Reliability of Computer Processed Data, GAO-09-680G (Washington, D.C.: July 1, 2009); and Standards for Internal Control in the Federal Government, GAO/AIMD-00-21.3.1 (Washington, D.C.: Nov. 1, 1999).

19See GAO-11-657. The four ports tested as part of this limited covert testing update were selected because (1) we conducted covert testing at these locations during our prior review and (2) they are geographically dispersed across the United States, representing the East Coast, South, and Southwest.

2078 Fed. Reg. 17,782 (Mar. 22, 2013).

Background

TWIC Program History and Our Prior Related Work

Following the terrorist attacks of September 11, 2001, the Aviation and Transportation Security Act (ATSA) was enacted in November 2001 and required TSA to work with airport operators to strengthen access controls to secure areas, and to consider using biometric access control systems, or similar technologies, to verify the identity of individuals who seek to enter a secure airport area.21 In response, TSA established the TWIC program in December 2001.22 TWIC was originally envisioned as a nationwide transportation worker identity solution to be used by approximately 6 million credential holders across all modes of transportation, including seaports, airports, rail, pipeline, trucking, and mass transit facilities. In November 2002, MTSA further required DHS to issue a maritime worker identification card that uses biometrics to control access to secure areas of maritime transportation facilities and vessels.23 TSA and USCG decided to implement TWIC initially in the maritime domain. Other transportation modes such as aviation have a preference for site-specific credentials.

21Pub. L. No. 107-71, § 106(c)(4), 115 Stat. 597, 610 (2001).

22TSA was transferred from the Department of Transportation to DHS pursuant to requirements in the Homeland Security Act of 2002, enacted on November 25, 2002. Pub. L. No. 107-296, 116 Stat. 2135.

2346 U.S.C. § 70105.

As defined by DHS, and consistent with the requirements of MTSA, the purpose of the TWIC program is to design and field a common biometric credential for all transportation workers across the United States who require unescorted access to secure areas at MTSA-regulated maritime facilities and vessels. As stated in the TWIC mission needs statement, the TWIC program aims to meet the following mission needs:


• positively identify authorized individuals who require unescorted access to secure areas of the nation’s transportation system,

• determine the eligibility of individuals to be authorized unescorted access to secure areas of the transportation system by conducting a security threat assessment,

• ensure that unauthorized individuals are not able to defeat or otherwise compromise the access system in order to be granted permissions that have been assigned to an authorized individual, and

• identify individuals who fail to maintain their eligibility requirements subsequent to being permitted unescorted access to secure areas of the nation’s transportation system and immediately revoke the individual’s permissions.24

In 2005, TSA conducted an analysis of alternatives and a cost-benefit analysis to identify possible options for addressing MTSA’s requirement to develop a biometric transportation security card that would also meet the related mission needs specified above.25 On the basis of these analyses, TSA determined that the best alternative was for the federal government to issue a single biometric credential that could be used across all vessels and maritime facilities, and for the government to manage all aspects of the credentialing process: enrollment, card issuance, and card revocation. TSA considered an alternative option based on a more decentralized and locally managed approach wherein MTSA-regulated facilities, vessels, and other port-related entities could issue their own credentials after individuals passed a TSA security threat assessment, but ultimately rejected the option (additional details are provided later in this report).

To help evaluate the TWIC program concept, TSA, through a private contractor, conducted prototype testing in 2004 and 2005 at 28 transportation facilities around the nation. However, in September 2006, we reported on the testing conducted by the contractor and identified challenges related to ensuring that the TWIC technology, such as biometric card readers, works effectively in the harsh maritime

24Transportation Security Administration, Mission Need Statement for the Transportation Worker Identification Credential (TWIC) Program, Sept. 20, 2006.

25Transportation Security Administration, Transportation Worker Identification Credential (TWIC) Program Analysis of Alternatives, Version 2.0, Feb. 15, 2005, and Transportation Worker Identification Credential (TWIC) Program Cost Benefit Analysis, Version 1.0, Aug. 31, 2005.


environment.26 We found that an independent assessment of the testing contractor’s report identified problems with the report, such as inaccurate and missing information. As a result, the independent assessment recommended that TSA not rely on the contractor’s final report on the TWIC prototype when making future decisions about the implementation of TWIC.

In 2006, the SAFE Port Act amended MTSA and directed the Secretary of Homeland Security to, among other things, implement a TWIC reader pilot to test the technology and operational impacts of deploying card readers at maritime facilities and vessels.27 TSA initiated the pilot in August 2008.28 This pilot was conducted with the voluntary participation of maritime port, facility, and vessel operators at 17 sites within the United States. In November 2009, we reported on the TWIC reader pilot design and planned approach, and found that DHS did not have a sound evaluation approach to ensure information collected through the TWIC reader pilot would be complete, accurate, and representative of deployment conditions.29 Among other things, we recommended that an evaluation plan and data analysis plan be developed to guide the remainder of the pilot and to identify how DHS would compensate for areas where the TWIC reader pilot would not provide the information needed to report to Congress and implement the TWIC card reader rule. DHS concurred with this recommendation. The status of TSA’s efforts to develop these plans is discussed later in this report. In addition, the Coast Guard Authorization Act of 2010 required that the findings of the pilot be included in a report to Congress, and that we assess the reported findings and recommendations.30

26GAO-06-982.

27Pub. L. No. 109-347, § 104(a), 120 Stat. 1884, 1888 (codified at 46 U.S.C. § 70105(k)).

28According to TSA, several factors contributed to delays in commencing the pilot: (1) participation was voluntary; (2) the first TWIC readers were not available for testing until July 2008, resulting in a 15-month delay in commencing the pilot; and (3) some facilities could acquire equipment or services quickly, while others required extensive bid processes or board of directors’ approval.

29GAO-10-43.

30Pub. L. No. 111-281, § 802, 124 Stat. 2905, 2989.


In May 2011, we reported that internal control weaknesses governing the enrollment, background checking, and use of TWIC potentially limited the program’s ability to provide reasonable assurance that access to secure areas of MTSA-regulated facilities is restricted to qualified individuals.31 We also reported that DHS had not assessed the TWIC program’s effectiveness at enhancing security or reducing risk for MTSA-regulated facilities and vessels. Further, we reported that DHS had not conducted a risk-informed cost-benefit analysis that considered existing security risks. We recommended, among other things, that DHS (1) assess TWIC program internal controls to identify needed corrective actions; (2) assess the TWIC program’s effectiveness; and (3) use the information from the assessment as the basis for evaluating the costs, benefits, security risks, and corrective actions needed to implement the TWIC program in a manner that will meet program objectives and mitigate existing security risks. DHS concurred with our recommendations and has taken steps to assess TWIC program internal controls. Appendix II summarizes key activities in the implementation of the TWIC program.

According to DHS documents, from fiscal year 2002 through fiscal year 2012, the TWIC program had funding and fee authority totaling $393.4 million, and the TWIC reader pilot cost approximately $23 million.32 In issuing the credential rule, which required each maritime worker seeking unescorted access to secure areas of MTSA-regulated facilities and vessels to possess a TWIC, DHS estimated that implementing the TWIC program could cost the federal government and the private sector a combined total of between $694.3 million and $3.2 billion over a 10-year period.33 However, these figures did not include costs associated with implementing and operating card readers, as the credential rule did not require the installation or use of TWIC cards with readers. The notice of

31GAO-11-657.

32Over $23 million had been made available to pilot participants from two Federal Emergency Management Agency (FEMA) grant programs: the Port Security Grant Program and the Transit Security Grant Program. Of the $23 million, grant recipients agreed to spend nearly $15 million on the TWIC reader pilot. However, DHS is unable to validate the exact amount grant recipients spent on the TWIC reader pilot, as rules for allocating what costs would be included as TWIC reader pilot costs versus other allowable grant expenditures were not defined. Sixteen of the 17 participating pilot sites were funded using these grants. In addition, TSA obligated an additional $8.1 million of appropriated funds to support the pilot.

3372 Fed. Reg. 3492, 3571 (Jan. 25, 2007).


proposed rulemaking published on March 22, 2013, estimated an additional cost of $234.2 million (undiscounted) to implement readers at the 570 facilities and vessels that the TWIC reader rule currently targets.34 However, USCG does not rule out expanding reader requirements in the future. Appendix III contains additional program funding details.

TSA’s Pilot to Test Key TWIC-Related Access Control Technologies

The TWIC reader pilot was intended to test the technology, business processes, and operational impacts of deploying TWIC readers at secure areas of the marine transportation system. Accordingly, the pilot was to test the viability of using selected biometric card readers to read TWICs within the maritime environment. It was also to test the technical aspects of connecting TWIC readers to access control systems. The results of the pilot are to inform the development of a proposed rule requiring the use of electronic card readers with TWICs at MTSA-regulated vessels and facilities.35

To conduct the TWIC reader pilot, TSA contracted with the Navy’s Space and Naval Warfare Systems Command (SPAWAR) to serve as the independent test agent to plan, analyze, evaluate, and report on all test events. Furthermore, the Navy’s Naval Air Systems Command (NAVAIR) conducted environmental testing of select TWIC readers.36 In addition, TSA partnered with the maritime industry at 17 pilot sites distributed across seven geographic locations within the United States.37 See appendix IV for a complete listing of the pilot sites, locations, and types of maritime operation each represented.

Levels of participation varied across the pilot sites. For example, at one facility, one pedestrian turnstile was tested out of 22 identified entry points. At another, the single vehicle gate was tested, but none of the seven pedestrian gates were tested. At a third facility with three pedestrian gates and 36 truck lanes, all three turnstiles and two truck lanes were tested. According to TSA, given the voluntary nature of the pilot, levels of participation varied across the pilot sites, and TSA could not dictate to the respective facilities and vessels specific and uniform requirements for testing.

34A notice of proposed rulemaking is published in the Federal Register and contains notices to the public of the proposed issuance of rules and regulations.

35Based on the August 2008 pilot initiation date, the TWIC card reader rule was to be issued no later than 24 months from the initiation of the pilot, or by August 2010, and a report on the findings of the pilot was to be submitted 4 months prior, or by April 2010. 46 U.S.C. § 70105(k). However, TSA reported that challenges encountered during the pilot, such as the voluntary nature of participation, resulted in delayed reporting.

36SPAWAR is a component of the Department of the Navy. It develops and deploys advanced communications and information capabilities for the Navy and supports the full life cycle of product and services delivery, including initial research and development and acquisition services, among others. NAVAIR is housed in the Department of the Navy and provides support such as testing and evaluating systems operated by sailors and marines.

37Pilot sites were located at the following locations: (1) Annapolis, Maryland; (2) Brownsville, Texas; (3) the Port Authority of New York / New Jersey and Staten Island, New York; (4) Long Beach and Los Angeles, California; (5) Norco, Louisiana; (6) Seattle, Washington; and (7) Vicksburg, Mississippi.

The TWIC reader pilot, as initially planned, was to consist of three sequential assessments, with the results of each assessment intended to inform the subsequent ones. Table 1 highlights key aspects of the three assessments.

Table 1: Three Assessments Planned for the Transportation Worker Identification Credential (TWIC) Reader Pilot

Initial technical test (ITT): This assessment is laboratory based and designed to determine if selected biometric card readers meet TWIC card reader specifications, which include technical (including functional) and environmental requirements deemed necessary for use in the harsh maritime environment.a At the completion of initial technical testing, a test report was to be developed to prioritize all problems with readers based on their potential to adversely impact the maritime transportation facility or vessel. On the basis of this assessment, readers with problems that would severely impact maritime operations were not to be recommended for use in the next phase of testing.

Early operational assessment (EOA): This assessment was to serve as an initial evaluation of the impact of TWIC reader implementation on the flow of commerce. Key results to be achieved as part of this assessment included obtaining essential data to inform development of the TWIC card reader rule, assessing reader suitability and effectiveness, and further refining reader specifications. As part of this assessment, maritime transportation facilities and vessels participating in the pilot were to select the readers they planned to test and install, and test readers as part of the test site’s normal business and operational environment. The Transportation Security Administration’s objective was to include pilot test participants representative of a variety of maritime transportation facilities and vessels in different geographic locations and environmental conditions.

System test and evaluation (ST&E): Building on the results of the initial technical testing and the early operational assessment, the system test and evaluation was intended to evaluate the full impact of maritime transportation facility and vessel operators complying with a range of requirements anticipated to be included in the TWIC card reader rule. In addition, this evaluation was expected to establish a test protocol for evaluating readers prior to acquiring them for official TWIC implementation.

Source: GAO analysis of TSA documentation on the TWIC reader pilot.

aTWIC card reader specifications were first published in September 2007 and updated on May 30, 2008.


To address time and cost constraints related to using the results of the TWIC reader pilot to inform the TWIC card reader rule, two key changes were made to the pilot tests in 2008. First, TSA and USCG inserted an initial reader evaluation as the first step of the initial technical test. This evaluation was an initial assessment of each reader’s ability to read a TWIC.38 Initiated in August 2008, the initial reader evaluation resulted in a list of biometric card readers from which pilot participants could select for use in the pilot rather than waiting for the entire ITT to be completed. Further, the list of readers that passed the initial reader evaluation was used by TSA and USCG to help select a limited number of readers for full functional and environmental testing.39

Second, TSA did not require the TWIC reader pilot to be conducted in the sequence highlighted in table 1. Rather, pilot sites were allowed to conduct the early operational assessment and the system test and evaluation testing while ITT was under way.

Various reports were produced to document the results of each TWIC reader pilot assessment. An overall report was produced to document the ITT results conducted prior to testing at pilot sites. To document the results of testing at each of the 17 pilot sites, the independent test agent produced one EOA report and one ST&E report for each site. These reports summarized information collected from each of the pilot sites and trip reports documenting the independent test agent’s observations during visits to pilot sites.

On February 27, 2012, DHS conveyed the results of the TWIC reader pilot by submitting the TWIC Reader Pilot Program report to Congress. On March 22, 2013, USCG issued a notice of proposed rulemaking that would, if finalized, require owners and operators of certain MTSA-regulated vessels and facilities to use readers designed to work with TWICs.40

38The initial reader evaluation is officially known as the initial capability evaluation.

39ITT full functional testing, or Functional Specification Conformance Test, was an evaluation of readers based on their ability to meet the TWIC specifications using 31 points of evaluation. As a result of this evaluation, the independent test agent was to provide a report to TSA on test metrics collected during functional testing to identify any functional or security problems related to reader performance. ITT full environmental testing, or Environmental Specification Conformance Test, included a series of tests to evaluate the card reader’s ability to operate in the expected electrical and environmental conditions that exist in the coastal ports of the United States, such as humidity, salt, fog, and dust.

TWIC Reader Pilot Results Are Not Sufficiently Complete, Accurate, and Reliable for Informing Congress and the TWIC Card Reader Rule

Challenges related to pilot planning, data collection, and reporting affect the completeness, accuracy, and reliability of the pilot test aimed at assessing the technology and operational impact of using TSA’s TWIC with card readers. Moreover, according to our review of the pilot and TSA’s past efforts to demonstrate the validity and security benefits of the TWIC program, the program’s premise and effectiveness in enhancing security are not supported.

Shortfalls in Planning Affected the Completeness, Accuracy, and Reliability of Data Collected during the Pilot

As we previously reported, TSA encountered challenges in its efforts to plan the TWIC reader pilot. In November 2009, we reviewed and reported on the TWIC reader pilot design and planned approach for collecting data at pilot sites.41 For example, we reported that the pilot test and evaluation documentation did not identify how individual pilot site designs and resulting variances in the information collected from each pilot site were to be assessed. This had implications for both the technology aspect of the pilot as well as the business and operational aspect. We further reported that pilot site test designs may not be representative of future plans for using TWIC because pilot participants were not necessarily using the technologies and approaches they intend to use in the future when TWIC readers are implemented at their sites.42

As a result, we reported that there was a risk that the selected pilot sites and test methods would not result in the information needed to understand the impacts of TWIC nationwide. At the time, TSA officials told us that no specific unit of analysis, site selection criteria, or sampling methodology was developed or documented prior to selecting the facilities and vessels to participate in the TWIC reader pilot.

40 78 Fed. Reg. 17,782 (Mar. 22, 2013).

41GAO-10-43. 42Officials at two of the seven pilot sites we visited at the time told us that the technology and processes expected to be in place during the pilot would likely not be the same as will be employed in the post-pilot environment, thereby reducing the reliability of the information collected at pilot locations.

As a result of these challenges, we recommended that DHS, through TSA and USCG, develop an evaluation plan to guide the remainder of the pilot that includes (1) performance standards for measuring the business and operational impacts of using TWIC with biometric card readers, (2) a clearly articulated evaluation methodology, and (3) a data analysis plan. We also recommended that TSA and USCG identify how they will compensate for areas where the TWIC reader pilot will not provide the necessary information needed to report to Congress and inform the TWIC card reader rule. DHS concurred with these recommendations.

While TSA developed a data analysis plan, TSA and USCG reported that they did not develop an evaluation plan with an evaluation methodology or performance standards, as we recommended. The data analysis plan was a positive step because it identified specific data elements to be captured from the pilot for comparison across pilot sites. If accurate data had been collected, adherence to the data analysis plan could have helped yield valid results. However, TSA and the independent test agent did not utilize the data analysis plan. According to officials from the independent test agent, they started to use the data analysis plan but stopped using the plan because they were experiencing difficulty in collecting the required data and TSA directed them to change the reporting approach. TSA officials stated that they directed the independent test agent to change its collection and reporting approach because of TSA’s inability to require or control data collection to the extent required to execute the data analysis plan. However, TSA and USCG did not fully identify how they would compensate for areas where the pilot did not provide the necessary information needed to report to Congress and inform the TWIC card reader rule. For example, such areas could include (1) testing to determine the impact of the business and operational processes put in place by a facility to handle those persons that are unable to match their live fingerprint to the fingerprint template stored on the TWIC and (2) requiring operators using a physical access control system in conjunction with a TWIC to identify how they are protecting personal identity information and testing how this protection affects the speed of processing TWICs. While USCG commissioned two studies to help compensate for areas where the TWIC reader pilot would not provide necessary information,43 the studies did not compensate for all of the challenges we identified in our November 2009 report.44 Such challenges included, for example, the impact of adding additional security protection on systems to prevent the disclosure of personal identity information and the related cost and processing implications.

In addition, our review of the TWIC reader pilot approach as implemented since 2009 and resulting pilot data identified some technology issues that affected the reliability of the TWIC reader pilot data. As DHS noted in its report to Congress, successful implementation of TWIC readers includes the development of an effective system architecture and physical access control system and properly functioning TWIC cards, among other things.45

TSA and independent test agent summary test results note that ambiguities within the TWIC card reader specification—the documented requirements for what and how TWIC card readers are to function—may have led to different interpretations and caused failures of tested TWIC systems.

However, not all TWIC card readers used in the TWIC reader pilot underwent both environmental and functional tests in the laboratory prior to use at pilot sites as originally intended. Because of cost and time constraints, TSA officials instead conducted an initial evaluation of all readers included in the pilot to determine their ability to read a TWIC. These initial evaluations resulted in a list of 30 biometric TWIC card readers from which pilot participants could select a reader for use. However, of these 30 readers, 8 underwent functional testing and 5 underwent environmental testing. None of the TWIC card readers underwent and passed all tests.

According to TSA, the readers that underwent laboratory-based environmental and functional testing and were placed on the TSA list of acceptable readers did not have problems that would severely impact pilot site operations or prevent the collection of useful pilot data and therefore the readers were all available for use during the pilot. However, according to our review of the pilot documentation, TSA did not define what “severely impact” meant or performance thresholds for reader problems identified during laboratory-based environmental and functional testing that would severely impact pilot site operations or prevent the collection of useful pilot data. Further, according to TSA officials, TSA could not eliminate 1 of the readers that may have failed a test from the list of acceptable readers when other readers that had not been tested would be allowed on the list. According to TSA officials, doing so would have been an unfair disadvantage to the readers that were selected for the more rigorous laboratory-based environmental and functional testing. In addition, TSA did not provide pilot sites with the results of the laboratory-based environmental and functional testing. According to TSA, it signed confidentiality agreements with reader vendors, which prevented it from sharing this information. The results could have been used to help inform each pilot site’s selection of readers appropriate for its organization’s environmental and operational considerations. This may have hindered TSA’s efforts to determine if issues observed during the pilot were due to the TWIC, TWIC reader, or a facility’s access control system. Nonetheless, TSA determined that information collected during reader laboratory-based testing and at pilot sites was still useful for refining future TWIC reader specifications.

43Systems Planning and Analysis, Inc. Survey of Physical Access Control System Architectures, Functionality, Associated Components, and Cost Estimates, a report prepared for the United States Coast Guard Office of Standards Evaluation and Development (CG-523), (Alexandria, Virginia: March 31, 2011). Booz, Allen, Hamilton. Port Facility Congestion Study, United States Coast Guard, a report prepared for the United States Coast Guard, (McLean, Virginia: February 16, 2011). 44GAO-10-43. 45For a TWIC-based access control system, system architecture refers to the selection, placement, and integration of systems required to make a decision about granting or denying access. Components of a TWIC-based access control system architecture may include, for example, the card readers, other systems or databases used (where needed) to make access control decisions, and the connectivity between the card readers and other systems or databases.

In addition, while TWIC cards are intended for use in the harsh maritime environment, the finalized TWIC cards did not undergo durability testing prior to testing at pilot sites. TSA selected card stock that had been tested in accordance with defined standards.46 However, TSA did not conduct durability tests of the TWIC cards after they were personalized with security features, such as the TWIC holder’s picture, or laminated.47 According to TSA, technology reasons that may render a TWIC card damaged include, among others, breakage to the antenna or the antenna’s connection to the card’s computing chip.48

The importance of durability testing has been recognized by other government agencies and reported by GAO as a means to identify card failures before issuance. For example, the Department of Defense’s (DOD) common access card—also used in harsh environments such as Afghanistan and other areas with severe weather conditions—has, according to DOD officials, been tested after personalization to ensure that it remains functional and durable.49 DOD also assesses returned nonfunctioning common access cards to identify the potential cause of card failures. In addition, in June 2010, as part of our review of another credential program, we recommended that the Department of State fully test or evaluate the security features on its Border Crossing Cards, including any significant changes made to the cards’ physical construction, security features, or appearance during the development process.50

Without testing the durability of personalized TWIC cards, the likelihood that TWIC cards and added security features can withstand the harsh maritime environment is unknown. According to TWIC program officials, each TWIC is tested to ensure it functions prior to being issued to an individual. However, the finalized TWIC card was not tested for durability to ensure that it could withstand the harsh maritime environment because doing so would be costly; TWIC is a fee-funded program, and the officials believed it would be unfair to pass on the cost to further test TWICs to consumers. However, testing TWIC credentials to ensure they can withstand the harsh maritime environment may prove to be more cost-effective, as it could minimize the time lost at access points and the TWIC holder’s need to pay a $60 replacement fee if the TWIC were to fail. Thus, durability testing TWIC cards after personalization could have reduced the pervasiveness of problems encountered with malfunctioning TWIC cards during the pilot.

As a result of the noted planning and preparation shortfalls, including (1) the absence of defined performance standards for measuring pilot performance, (2) variances in pilot site testing approaches without compensating measures to ensure complete and comparable data were collected, and (3) inadequate testing to ensure that piloted readers and TWICs worked as intended, the data TSA and the independent test agent collected on the technology and operational impacts of using TWIC at pilot sites were not complete, accurate, and reliable.

46Card stock is a blank card that includes physical characteristics such as the antenna and computer chip. Card stock is used to manufacture TWIC cards. The card stock used by the TWIC program has been evaluated by the General Services Administration’s Federal Information Processing Standards (FIPS) 201 Evaluation Program to determine whether the card stock meets federal standards. These standards include various durability tests of blank card stock prior to approving the card stock for placement on the General Services Administration’s list of products approved for use by federal agencies. 47As we reported in April 2011, although not required to comply with FIPS-201 (Personal Identity Verification [PIV] of Federal Employees and Contractors), as a policy decision, DHS and TSA decided to align the TWIC program with FIPS-201 standards where possible. To satisfy FIPS-201 security and technical interoperability requirements, a PIV card must be personalized and include identity information for the individual to whom the card is issued. Further, it must be free of defects such as fading and discoloration, not impede access to machine-readable information, and meet durability requirements. FIPS-201 requires durability tests to evaluate card material durability and performance. Card durability can be affected by what is added to the card upon completion of personalization. For example, adding laminated card finishes and security features may affect the durability of the finished card. 48The antenna is the piece of technology needed for a contactless reader to communicate with a TWIC. 49DOD’s common access card is an identification card used by active-duty military personnel, DOD civilian employees, and eligible contractor personnel.

Data Collection Challenges Were Encountered during the TWIC Reader Pilot

In addition to the pilot planning challenges discussed above, we found that the data collected through the pilot are also not generalizable because of certain pilot implementation and data collection practices we identified.51 As required by the SAFE Port Act of 2006, the pilot was to test the technology and operational impacts of deploying transportation security card readers at secure areas of the marine transportation system. In addition, as set forth in the TWIC test and evaluation master plan, the TWIC reader pilot was to provide accurate and timely information necessary to evaluate the economic impact of a nationwide deployment of TWIC card readers at over 16,400 MTSA-regulated facilities and vessels, and was to be focused on assessing the use of TWIC readers in contactless mode.52

50See GAO, Border Security: Improvements in the Department of State’s Development Process Could Increase the Security of Passport Cards and Border Crossing Cards, GAO-10-589 (Washington, D.C.: June 1, 2010). We reported that the Department of State (State) tested and evaluated the security of prototypes of the passport card, which did not include key features such as the background artwork, personalization features, and other security features that were added or changed for the final passport card. We recommended that State fully test or evaluate the security features on the cards as they will be issued, including any significant changes made to the cards’ physical construction, security features, or appearance during the development process. State concurred and reported taking actions to address the recommendation. 51Collected operational and performance data cannot be generalized across each pilot site or nationally across all pilot sites where use of TWIC will be required because neither the pilot sites nor access points tested were selected randomly. According to USCG officials, USCG believes that cost data derived from the pilot can be extrapolated. However, on the basis of our analysis and review of pilot data, given the limitations of collected cost data, including that pilot sites did not necessarily implement readers and associated access control systems as they intend in the future, use of cost data derived from the pilot should be limited and used with caution, if at all. Reliable data on the pervasiveness of TWIC card issues, access control systems erroneously preventing access to facilities, queues at access points, and ongoing reader and related access control system operations and maintenance costs, among others, are needed to reliably determine the economic cost impact of using TWIC with readers.

However, data were collected and recorded in an incomplete and inconsistent manner during the pilot, further undermining the completeness, accuracy, and reliability of the data collected at pilot sites. Table 2 presents a summary of TWIC reader pilot data collection and supporting documentation reporting weaknesses that we identified that affected the completeness, accuracy, and reliability of the pilot data, which we discuss in further detail below.

Table 2: Weaknesses in the Transportation Worker Identification Credential (TWIC) Reader Pilot Affecting the Completeness, Accuracy, and Reliability of Data Collected

  1. Installed TWIC readers and access control systems could not collect required data on TWIC reader use, and TSA and the independent test agent did not employ effective compensating data collection measures.
  2. Reported transaction data did not match underlying documentation.
  3. Pilot documentation did not contain complete TWIC reader and access control system characteristics.
  4. Transportation Security Administration (TSA) and the independent test agent did not record clear baseline data for comparing operational performance at access points with TWIC readers.
  5. TSA and the independent test agent did not collect complete data on malfunctioning TWIC cards.
  6. Pilot participants did not document instances of denied access.
  7. TSA and the independent test agent did not collect consistent data on the operational impact of using TWIC cards with readers.
  8. Pilot site reports did not contain complete information about installed TWIC readers’ and access control systems’ design.

Source: GAO.

52U.S. Department of Homeland Security, Transportation Security Administration. Transportation Worker Identification Credential (TWIC) Contactless Biometric Card and Reader Capability Pilot Test, Test and Evaluation Master Plan (TEMP), approved December 2007. As used in this report, contactless mode refers to the use of TWIC readers for reading TWIC cards without requiring that a TWIC card be inserted into or make physical contact with a TWIC reader.


  1. Installed TWIC readers and access control systems could not collect required data on TWIC reader use, and TSA and the independent test agent did not employ effective compensating data collection measures. The TWIC reader pilot test and evaluation master plan recognizes that in some cases, readers or related access control systems at pilot sites may not collect the required test data, potentially requiring additional resources, such as on-site personnel, to monitor and log TWIC card reader use issues. Moreover, such instances were to be addressed as part of the test planning. However, the independent test agent reported challenges in sufficiently documenting reader and system errors. For example, in its monthly communications with TSA, the independent test agent reported that the logs from the TWIC readers and related access control systems were not detailed enough to determine the reason for errors, such as biometric match failure, an expired TWIC card, or that the TWIC was identified as being on the list of revoked credentials. The independent test agent further reported that the inability to determine the reason for errors limited its ability to understand why readers were failing, and thus it was unable to determine whether errors encountered were due to TWIC cards, readers, or users, or some combination thereof. As a result, according to the independent test agent, in some cases the TWIC readers and automated access control systems at various pilot sites were not capable of collecting the data required to assess pilot results. According to the independent test agent, this was primarily due to the lack of reader messaging standards—that is, a set of standard messages readers would display in response to each transaction type. Some readers used were newly developed by vendors, and some standards were not defined, causing inconsistencies in the log capabilities of some readers.53

The independent test agent noted that reader manufacturers and system integrators—or individuals or companies that integrate TWIC-related systems—were not all willing to alter their systems’ audit logs to collect the required information, such as how long a transaction might take prior to granting access. Both TSA and the independent test agent agree that this issue limited their ability to collect the data needed for assessing pilot results.

According to TSA officials, TSA allowed pilot participants to select their own readers and related access control systems and audit logs. Consequently, TSA could not require that logs capable of meeting pilot data collection needs be used. In addition, TSA officials noted that a determination of the reason for certain errors, such as biometric match failures, could be made only while the independent test agent was present and had the time and ability to investigate the reason that a TWIC card had been rejected by a reader for access. On average, the independent test agent visited each pilot participant seven times during the early operational assessment and system test and evaluation testing period. TSA further noted that the development or use of alternative automated data collection methods would have been costly and would have required integration with the pilot site’s system. However, given that TSA was aware of the data needed from the pilot sites prior to initiating testing and the importance of collecting accurate and consistent data from the pilot, proceeding with the pilot without implementing adequate compensating mechanisms for collecting requisite data or adjusting the pilot design accordingly is inconsistent with the basic components of effective evaluation design and renders the results less reliable.

53The independent test agent could not provide an exact count of the readers with log capability inconsistencies.

  2. Reported transaction data did not match underlying documentation. A total of 34 pilot site reports were issued by the independent test agent.54 According to TSA, the pilot site reports were used as the basis for DHS’s report to Congress. We separately requested copies of the 34 pilot site reports from both TSA and the independent test agent. In comparing the reports provided, we found that 31 of the 34 pilot site reports provided to us by TSA did not contain the same information as those provided by the independent test agent.55 Differences for 27 of the 31 pilot site reports pertained to how pilot site data were characterized, such as the baseline throughput time used to compare against throughput times observed during two phases of testing: early operational assessment and systems test and evaluation. For example, TSA inserted USCG’s 6-second visual inspection estimate as the baseline throughput time measure for all pilot site access points in its amended pilot site reports instead of the actual throughput time collected and reported by the independent test agent during baseline data collection efforts.56 However, at two pilot sites, Brownsville and Staten Island Ferry, transaction data reported by the independent test agent did not match the data included in TSA’s reports. For example, of the 15 transaction data sets in the Staten Island Ferry ST&E report, 10 of these 15 data sets showed different data reported by TSA and the independent test agent. These differences were found in the weekly transactions and the sum total of valid and invalid transactions.57

54The independent test agent was to conduct two phases of testing—EOA and ST&E—at the 17 pilot sites. The independent test agent issued 2 pilot site reports for each of the 17 pilot sites—1 based on EOA tests and the other based on ST&E tests—for a total of 34 pilot site reports. 55In total, we reviewed 68 reports; 34 provided by TSA and 34 provided by the independent test agent.

According to TSA officials, TSA used an iterative process to review and analyze pilot data as the data became available from the pilot participant sites. In addition, TSA officials noted that the independent test agent’s reports were modified in order to “provide additional context” and consistent data descriptions, and to present data in a more usable or understandable manner. Specifically, according to TSA officials, they and USCG officials believed that they had more knowledge of the data than the independent test agent and there was a need, in some cases, for intervening and changing the test reports in order to better explain the data. USCG officials further noted that the independent test agent’s draft reports were incomplete and lacked clarity, making revisions necessary to make the information more thorough. TSA also reported that it inadvertently used an earlier version of the report and not the final September 2011 site reports provided by the independent test agent to prepare the report to Congress.

In addition to differences found in the EOA and ST&E pilot site reports, we found differences between the data recorded during the independent test agent’s visits to pilot sites versus data reported in the EOA and ST&E pilot site reports. Data recorded during the independent test agent’s visits to pilot sites in trip reports were to inform final pilot site reports. The independent test agent produced 76 trip reports containing throughput data. We examined 34 of the 76 trip reports and found that all 34 trip reports contained data that were excluded or did not match data reported in the EOA and ST&E pilot site reports completed by the independent test agent. According to the independent test agent, the trip reports did not match the EOA and ST&E pilot site reports because the trip reports contained raw data that were analyzed and prepared for presentation in the participant EOA and ST&E pilot site reports. However, this does not explain why data reported by date in trip reports do not match related data in the EOA and ST&E pilot site reports. Having inconsistent versions of final pilot site reports, conflicting data in the reports, and data excluded from final reports without explanation calls into question the accuracy and reliability of the data.

56According to TSA, it applied the 6-second baseline for visually inspecting a TWIC to account for the artificially short measured inspection times at some facilities where security personnel were observed allowing access without completing the three-step visual verification process required by USCG regulations: (1) checking the card expiration date, (2) comparing the photo on the card against the worker presenting the card, and (3) examining one of the security features on the card. USCG and TSA concurred that a minimum of 6 seconds is required to complete a compliant visual inspection. In addition, the 6-second baseline was also inserted at sites where the recorded baseline was longer than 6 seconds. 57In addition to discrepancies observed in recorded transaction data for the Brownsville and Staten Island Ferry pilot sites, TSA did not collect transaction data at two pilot sites during the ST&E phase or throughput data at three pilot sites. The ST&E phase was intended to evaluate the full impact of facility and vessel operators complying with a range of anticipated identity verification requirements to be established through the TWIC reader rule.

  1. Pilot documentation did not contain complete TWIC reader and access control system characteristics. Pilot documentation did not always identify which TWIC readers or which interface (e.g., contact or contactless interface) the reader used to communicate with the TWIC card during data collection. For example, at one pilot site, two different readers were tested. However, the pilot site report did not identify which data were collected using which reader. Likewise, at pilot sites that had readers with both a contact and a contactless interface, the pilot site report did not always identify which interface was used during data collection efforts. According to TSA officials, sites were allowed to determine which interface to use based on their business and operational needs. According to the independent test agent, it had no control over what interface pilot sites used during testing if more than one option was available. Consequently, pilot sites could have used the contactless interface for some transactions and the contact interface for others without recording changes. The independent test agent therefore could not document with certainty which interface was used during data collection efforts. Without accurate documentation of information such as this, an assessment of TWIC reader performance based on interface cannot be determined. This is a significant data reliability issue, as performance may vary depending on which interface is used, and in accordance with the TWIC reader pilot’s test and evaluation master plan, use of the contactless interface was a key element to be evaluated during the pilot.
  2. TSA and the independent test agent did not record clear baseline data for comparing operational performance at access points with TWIC readers. Baseline data, which were to be collected prior to piloting the use of TWIC with readers, were to be a measure of throughput time, that is, the time required to inspect a TWIC card and complete access-related processes prior to granting entry. This was to provide the basis for quantifying and assessing any TWIC card reader impacts on the existing systems at pilot sites.58

Pilot documentation shows that baseline throughput data were collected for all pilot sites. However, it is unclear from the documentation whether the acquired data were sufficient to reliably identify throughput times at truck, other vehicle, and pedestrian access points, which may vary. It is further unclear whether the summary baseline throughput data presented are based on a single access point, on an average of all like access points, or on the access points that were actually tested during later phases of the pilot. Further complicating the analysis of baseline data is that there was a TSA version of the baseline report and a separate version produced by the independent test agent, and the facts and figures in the two do not fully match. Where both documents present summary baseline throughput data for each pilot site, the figures differ for each pilot site. For example, summary baseline throughput at one pilot site is reported as 4 minutes and 10 seconds in one version of the report but as 47 seconds in the other. As a result, the accuracy and reliability of the available baseline data are questionable. Further, according to TSA, where summary throughput data were not included in the baseline report, the independent test agent’s later site reports did contain the data.

58Baseline data were to include, among other things, throughput times, the number and type of access points at a pilot site, the transactions (traffic) through each access point, and the populations accessing these points prior to TWIC reader installation.

Page 24 GAO-13-198 TWIC Reader Pilot Review

  3. TSA and the independent test agent did not collect complete data on malfunctioning TWIC cards. TSA officials observed malfunctioning TWICs during the pilot, largely because of broken antennas. The antenna is the piece of technology needed for a contactless reader to communicate with a TWIC. If a TWIC with a broken antenna was presented for a contactless read, the reader would not identify that a TWIC had been presented, as the broken antenna would not communicate TWIC information to the reader. In such instances, the reader would not log that an access attempt had been made and failed. Individuals holding TWICs with bad antennas had presented their TWICs at contactless readers; however, the readers did not document and report each instance that a malfunctioning TWIC was presented. Instead, as noted by pilot participants and confirmed by TSA officials, pilot sites generally conducted visual inspections when confronting a malfunctioning TWIC and granted the TWIC holder access. While in some cases the independent test agent used a card analysis tool to assess malfunctioning TWICs, TSA officials reported that neither they nor the independent test agent documented the overall number of TWICs with broken antennas or other damage. According to TSA officials, this number was not tracked because failed TWIC cards could be tracked only if an evaluator was present, had access to a card analysis tool, and had the cooperation of the pilot participants to hold up a worker’s access long enough to confirm that the problem was the TWIC card and not some other factor. However, it is unclear why TSA was unable to provide a count of TWICs with broken antennas or other damage based on the TWIC cards that were analyzed with the card analysis tool.

While TSA could not provide an accounting of TWICs with broken antennas or other damage experienced during the pilot, pilot participants and other data collected provide additional context and perspective for understanding the nature and extent of TWIC card failure rates during the pilot. Officials at one pilot container facility told us that a 10 percent failure rate would be unacceptable and would slow down cargo operations. However, according to officials from two pilot sites, approximately 70 percent of the TWICs they encountered when testing TWICs against contactless readers had broken antennas or malfunctioned. Further, a separate 2011 report commissioned and led by USCG identified problems with reading TWICs in contactless mode during data collection.59 This report identified one site where 49 percent of TWICs could not be read in contactless (or proximity60) mode, and two other sites where 11 percent and 13 percent of TWICs could not be read in contactless mode. Because malfunctioning TWIC cards could not be detected by readers, individuals may have made multiple attempts to get the TWIC reader to read the TWIC card; however, each attempt was not recorded, and thus TSA does not have an accurate accounting of the number of attempts or the time it may have taken to resolve resulting access issues. Consequently, assessments of the operational impacts of using TWIC with readers based on the collected data alone should be interpreted cautiously, as they may rest on inaccurate data.

59Systems Planning and Analysis, Inc. Survey of Physical Access Control System Architectures, Functionality, Associated Components, and Cost Estimates, prepared for the United States Coast Guard Office of Standards Evaluation and Development (CG-523), (Alexandria, Virginia: March 31, 2011).

60As used in this report, reading a card in proximity mode is the same as reading a card in contactless mode.

In discussing these failure rates with TSA officials, the officials reported that TSA does not have a record of a pilot participant reporting a 70 percent failure rate.61 In addition, they believe that the failure rates reported by pilot sites and the separate USCG-commissioned report are imperfect because the sites did not have the card analysis tool necessary to confirm a failed TWIC card, and instances where a failed TWIC card was presented at a pilot site could be documented only when the independent test agent was present at the site with a card analysis tool. However, a contractor from TSA visited the facility where the USCG report notes that 49 percent of TWICs could not be read in contactless mode and found that 60 of the 110 TWIC cards checked, or 54.5 percent, would not work in contactless mode. TSA officials agreed that TWIC card failure rates were higher than anticipated and stated that TSA is continuing to assess TWIC card failures to identify the root cause of the failures and correct for them. TSA is also looking to test TWIC cards at facilities that have not previously used TWIC readers to get a better sense of how inoperable TWIC cards might affect a facility operationally.

61According to TSA officials, workers were not required to replace malfunctioning cards during the pilot. Therefore, a worker could present the same malfunctioning card to a reader upon each entry to a facility.

  4. Pilot participants did not document instances of denied access. Incomplete data resulted from challenges in documenting how individuals with a denied TWIC were managed across pilot sites. The independent test agent reported that facility security personnel were unclear on how to process people who are denied access by a TWIC reader because of a biometric mismatch or other TWIC card issue. In these cases, pilot site officials would need to receive input from USCG as to whether to grant or deny access to an individual presenting a TWIC card that had been denied. According to TSA officials, during the pilot, if a TWIC reader denied access to a TWIC, the facility could visually inspect the TWIC, as allowed under current regulation, and grant the individual access. However, TSA and the independent test agent did not require pilot participants to document when individuals were granted access based on a visual inspection of the TWIC, or denied access as may be required under future regulation. This is contrary to the TWIC reader pilot test and evaluation master plan, which calls for documenting the number of entrants “rejected” with the TWIC card reader system operational as part of assessing the economic impact. Without such documentation, the pilot sites were not completely measuring the operational impact of using TWIC with readers.
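The failure-rate figures discussed above can be sanity-checked with a few lines of arithmetic. The sketch below simply reproduces the TSA contractor's spot-check calculation using the counts quoted in the text (60 of 110 cards failing in contactless mode); it is an illustration of the arithmetic, not part of the pilot's tooling.

```python
# Reproduce the failure-rate arithmetic cited in the report: a TSA contractor
# checked 110 TWIC cards at one facility and found 60 that would not work in
# contactless mode.
checked = 110
failed = 60

failure_rate = failed / checked
print(f"{failure_rate:.1%}")  # prints "54.5%", the figure reported by the contractor
```

The same calculation applied to the USCG-commissioned survey's counts would yield the 49 percent figure that the contractor's visit was checking against.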

  5. TSA and the independent test agent did not collect consistent data on the operational impact of using TWIC cards with readers. TWIC reader pilot testing scenarios included having each individual present his or her TWIC for verification; however, it is unclear whether this actually occurred in practice. For example, at one pilot site, the independent test agent did not require each individual to have his or her TWIC checked during throughput data collection.62 Officials at the pilot site noted that during testing, approximately 1 in 10 individuals was required to have his or her TWIC checked while entering the facility because of concerns about causing a traffic backup. They said that this approach was used because pilot site officials believed that reading each TWIC would have caused significant congestion. However, the report for the pilot site does not note this selective use of the TWIC card. In addition, officials from another pilot site reported that truck drivers could elect to go to other lanes that were not being monitored during throughput time collection. Officials at this pilot site noted that truck drivers, observing congestion in lanes where throughput time was being collected, used other lanes to avoid delays. This was especially the case when the tested truck lane was blocked to troubleshoot TWIC card and reader problems. However, the pilot site report did not record these congestion issues or their avoidance through the use of alternative lanes where TWIC readers were not being tested. TSA officials also noted that another pilot site would allow trucks entry without using a TWIC reader on an ad hoc basis during the pilot to prevent congestion, making it difficult to consistently acquire the data needed to accurately assess the operational impacts, such as truck congestion, of using TWIC cards with readers. Despite these deviations from test protocols, the reports for these pilot sites do not note that the deviations occurred.

62Throughput data include the timing of the approach, credential check and clearance (if applicable), and physical movement through the point of entry. For the TWIC reader pilot, throughput data collection for truck and vehicle traffic was to begin once the truck or vehicle came to a complete stop at the access point and end once the truck or vehicle pulled away from the access point. The timing of pedestrian throughput may be calculated from the time an individual presents his or her TWIC or comes within 2 feet of the access point (depending on the access process) and end once the access point is ready to receive the next entrant.

In commenting on this issue, TSA officials noted that these deviations occurred most frequently at those facilities with multiple truck or pedestrian access points where readers were installed at a few access points. Most commonly these facilities were large container terminals. Because of the voluntary nature of the pilot, TSA elected to primarily use reader performance data from facilities that did not install and use readers at all access points. TSA officials further noted that the impact of readers on operations at these facilities necessarily was discounted in the final report to Congress. However, pilot documentation shows that container terminals held the largest population of individuals potentially requiring the use of a TWIC. Noting deviations such as those described above in each pilot site report would have provided important perspective by identifying the limitations of the data collected at the pilot site and providing context when comparing the pilot site data with data from other pilot sites. Further, identifying the presence of such deviations could have helped the independent test agent and TSA recognize the limitations of the data when using them to develop and support conclusions for the pilot report on the business and operational impact of using TWICs with readers.

  6. Pilot site reports did not contain complete information about the design of installed TWIC readers and access control systems. TSA and the independent test agent tested the TWIC readers at each pilot site to ensure they worked before individuals began presenting their TWIC cards to the readers during the pilot. As part of this test, information on how each TWIC reader communicated with TWICs and related access control systems was to be documented. In accordance with TWIC test plans, this testing was to specify, among other things, whether the TWIC reader (1) was contactless or required contact with a TWIC, (2) communicated with a facility’s physical access control system(s) through a wired or wireless conduit, or (3) granted or denied access to a TWIC holder itself or relied on a centralized access system to make that determination. However, the data gathered during the testing were incomplete. For example, 10 of 15 sites tested readers for which no record of system design characteristics was recorded.63 In addition, pilot reader information was identified for 4 pilot sites but did not identify the specific readers or associated software tested. Further, 1 pilot site report included reader information for another pilot site and none for its own. This limited TSA’s ability to assess performance results by various reader and access control system characteristics. The absence of this information is particularly important, as it was the only source of data recorded at pilot sites where reader and operational throughput performance could be assessed at a level of granularity that would allow for the consideration of the array of reader, system design, and entry process characteristics. According to TSA officials, collecting these data was the independent test agent’s responsibility, but the independent test agent did not record and provide all required data. The independent test agent maintains that the data are present. However, we reviewed the documentation and did not find the data.

63The reported figures exclude the Brownsville and Staten Island Ferry pilot sites because of the extent of reporting deficiencies identified in reported data for these sites.

As we have previously reported, the basic components of an evaluation design include identifying information sources and measures, data collection methods, and an assessment of study limitations, among other things.64 We further reported that care should be taken to ensure that collected data are sufficient and appropriate,65 and that measures are incorporated into data collection to ensure that data are accurate and reliable.66 Data may not be sufficiently reliable if (1) significant errors or incompleteness exist in some or all of the key data elements,67 and (2) using the data would probably lead to an incorrect or unintentional message.68

64GAO-12-208G.

65Sufficiency refers to the quantity of evidence. Appropriateness refers to the relevance, validity, and reliability of the evidence in supporting the evaluation objectives.

66Accuracy refers to the extent that recorded data reflect the actual underlying information. Consistency, a subcategory of accuracy, refers to the need to obtain and use data that are clear and well defined enough to yield similar results in similar analyses. For example, if data are entered at multiple sites, inconsistent interpretation of data entry rules can lead to data that, taken as a whole, are unreliable.

67Completeness refers to the extent that relevant records are present and the fields in each record are populated appropriately.

68GAO-09-680G.

Moreover, in accordance with Standards for Internal Control in the Federal Government, controls are to be designed to help ensure the accurate and timely recording of transactions and events. Properly implemented control activities help to ensure that all transactions are completely and accurately recorded.69

69GAO/AIMD-00-21.3.1.

Having measures in place to ensure that collected data are complete, are not subject to inappropriate alteration, and are collected in a consistent manner helps ensure that data are accurate and reliable. However, as discussed in the examples above, TSA and the independent test agent did not take the steps needed to ensure the completeness, accuracy, and reliability of TWIC reader data collected at pilot sites, and the pilot lacked effective mechanisms for ensuring that transactions were completely and consistently recorded.

According to TSA, a variety of challenges prevented TSA and the independent test agent from collecting pilot data in a complete and consistent fashion. Among the challenges noted by TSA: (1) pilot participation was voluntary, which allowed pilot sites to stop participating at any time or not adhere to established testing and data collection protocols; (2) the independent test agent did not correctly and completely collect and record pilot data; (3) systems in place during the pilot did not record all required data, including information on failed TWIC card reads and the reasons for the failures; and (4) prior to pilot testing, officials did not expect to confront problems with nonfunctioning TWIC cards. Additionally, TSA noted that it lacked the authority to compel pilot sites to collect data in a way that would have complied with federal standards. In addition to these challenges, the independent test agent identified the lack of a database for tracking and analyzing all pilot data in a consistent manner as a further challenge to data collection and reporting. The independent test agent noted, however, that all data collection plans and resulting data representation were ultimately approved by TSA and USCG. Nevertheless, our review of pilot test results shows that because the resulting pilot data are incomplete, inaccurate, and unreliable, they should not be used to help inform the card reader rule. While TSA’s stated challenges may have hindered TWIC reader pilot efforts, planning and management shortfalls also resulted in TWIC reader pilot data being incomplete, inaccurate, and unreliable. The challenges TSA and the independent test agent confronted during the pilot limited their data collection efforts, which were a critical piece of the assessment of the technology and operational impacts of using TWIC at pilot sites that were to be representative of actual deployment conditions.

Issues with DHS’s Congressional Report on the Pilot and the Validity of the TWIC Security Premise Raise Concerns about the Effectiveness of the TWIC Program

DHS’s Report to Congress Presented Findings and Lessons Learned That Were Not Always Supported by the Collected Data

As required by the SAFE Port Act and the Coast Guard Authorization Act of 2010, DHS’s report to Congress on the TWIC reader pilot presented several findings with respect to technical and operational aspects of implementing TWIC technologies in the maritime environment. DHS reported the following, among other findings:

  1. Despite facing a number of challenges, the TWIC reader pilot obtained sufficient data to evaluate reader performance and assess the impact of using readers at ports and maritime facilities.
  2. A biometric match may take longer than a visual inspection alone, but not long enough to cause access point throughput delays that would negatively impact business operations.
  3. When designed, installed, and operated in manners consistent with the business considerations of the facility or vessel operation, TWIC readers provide an additional layer of security by reducing the risk that an unauthorized individual could gain access to a secure area.

In addition, the report noted a number of lessons learned. For example, TWIC cards were found to be sensitive to wet conditions, and users experienced difficulty reading messages on the screens of readers not shielded from direct sunlight, which prevented users from determining the cause of an access denial, among other things. According to officials from TSA and DHS’s Screening Coordination Office, many of these lessons learned did not require a pilot in order to be identified, but the pilot made a positive contribution by helping to validate them. Additionally, officials from DHS’s Screening Coordination Office noted that they believe the report to Congress included a comprehensive listing of the extent to which established metrics were achieved during the pilot program, as required by the Coast Guard Authorization Act of 2010.

However, according to our review, the findings and lessons learned in DHS’s report to Congress were based on incomplete or unreliable data, and thus should not be used to inform the development of the future regulation on the use of TWIC with readers. Specifically, incomplete TWIC cost data and unreliable access point throughput time data result in an inaccurate description of the impact of TWIC on MTSA-regulated facilities and vessels. Further, data on the security benefits of TWIC were not collected as part of the pilot and therefore the statements made in DHS’s report to Congress are not supported by the pilot data.

Reported Costs

DHS’s report identified costs for implementing TWIC readers during the pilot. However, the costs reported by DHS do not represent the full costs of operating and maintaining TWIC readers and related systems within a particular site, or the cost of fully implementing TWIC at all sites. First, DHS’s reported costs for implementing TWIC with readers during the pilot did not consistently reflect the costs of implementing TWIC at all access points needed for each facility. For example, DHS’s report correctly notes that 2 container facilities did not implement TWIC readers at all access points and are therefore not reflective of full implementation. However, on the basis of our analysis and interviews with pilot site officials, at least 5 of the remaining pilot sites would need to make additional investments in readers, totaling 7 pilot sites requiring investments beyond reported expenditures. For example, officials at 2 pilot sites told us that they would need to invest in and install additional readers if reader use was required by regulation. Officials at 3 pilot sites told us that their investment in TWIC readers during pilot testing was not representative of how they would invest in TWIC if regulation required that an individual’s TWIC be checked with a reader at each entry.70

70Specifically, 2 of 3 pilot sites tested portable readers alone, which required a limited investment, but would install fixed readers if readers were required by regulation to better address their needs. The security official from the third pilot site that primarily tested stand-alone fixed readers told us that his preference, if reader use is required, would be to use networked readers connected to a centralized access control system for making access determinations.


Physical Security About Locks


Please look around you at work and, without naming the organization, list the types of locks that are present around you. After you have compiled the list, use the course text and any other relevant literature you can find to describe how each type of lock can be compromised. Please note that the main body of the document needs to be at least one page.

1) Use of locks in physical crime prevention

Objectives: key-operated mechanisms, combination locks, lock bodies and door locks, attacks and countermeasures, locks and the systems approach to security

2) Safes, vaults, and accessories

Reference for an idea: https://www.youtube.com/watch?v=RqTPy4oukzI


What-If Analysis Excel Computer Data


Download the instructions file and starting file for this project and complete all steps. Use the images provided to verify that your results are correct.

Documentation
Ralston County Youth Conference Analysis
Author:
Date:
Purpose:
Proposal
Ralston County Youth Conference

Each of the three sections below spans the same set of year columns: Startup, Year 1 through Year 20, Shut-down, and a TOTALS column. All figures are in millions.

Expenses (millions): Lease; Building Construction; Building Operations; TOTALS

Income (millions): Student Fees; Vendor Fees; Sponsorship Fees; Sale of buildings; TOTALS

Analysis (millions): Income; Expenses; Profit; Discount Rate; Net Present Value; IRR
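The Analysis rows ask for profit, net present value, and internal rate of return. If you want to double-check your spreadsheet formulas outside Excel, a short script like the one below can reproduce them. The cash-flow figures and the 6 percent discount rate here are placeholders for illustration, not values from the assignment's starting file.

```python
# Sketch of the Analysis-row calculations (hypothetical cash flows; the real
# figures come from the Expenses and Income sections of the starting file).

def npv(rate, cashflows):
    """Net present value: cashflows[0] is the startup-year flow (undiscounted)
    and each subsequent flow is discounted one more year."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal rate of return via bisection: the rate at which NPV = 0.
    Assumes the NPV changes sign exactly once on [lo, hi]."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if npv(lo, cashflows) * npv(mid, cashflows) <= 0:
            hi = mid          # root lies in the lower half
        else:
            lo = mid          # root lies in the upper half
        if hi - lo < tol:
            break
    return (lo + hi) / 2

# Profit per year = income - expenses (illustrative numbers, in millions):
# a startup-year outlay, twenty years of operating profit, a shut-down gain.
profit = [-12.0] + [1.5] * 20 + [3.0]

rate = 0.06                   # assumed discount rate
print(round(npv(rate, profit), 2))
print(round(irr(profit), 4))
```

Excel's NPV function discounts the first argument's cash flow by one period, so to match it you would typically write something like `=B40 + NPV(rate, C40:W40)`, keeping the startup-year flow outside the function as this sketch does.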


4-1 Discussion


Projects that meet requirements and that are delivered on time and within cost begin with a solid project plan.

For your initial post, share your high-level project timeline, showing major milestones and deliverables. From Modules Four through Nine, you will design, implement, and test your project, and write a project report. Your project plan does not have to be highly detailed, but it does have to be realistic. It is critical that the timeline reflect the level of effort required to complete the project by the end of Module Nine. In the project plan, identify any risks to the project that currently need to be mitigated.

Then, compare and contrast your plan with your classmates’ plans. Were there any activities, risks, or tasks that your classmates omitted from their plans? Was there anything in their plans that you feel is omitted from yours?
