ISC2 CCSP Exam Questions
21.
Which of the following is a strategy for maintaining operations during a business-disrupting event?
- BCP
- DRP
- BIA
- SAMM
Correct answer: BCP
A business continuity plan (BCP) is a strategy for maintaining operations during a business-disrupting event. A disaster recovery plan (DRP) is a strategy for restoring normal operations after such an event.
Business impact analysis (BIA) focuses on identifying the business impact if an asset, system, or process is degraded or lost.
The OWASP Software Assurance Maturity Model (SAMM) can help organizations assess the security of their current development practices.
22.
There are four main steps in audit planning. Choose the correct sequence of audit planning steps.
- Define objectives, define scope, conduct the audit, lessons learned
- Define scope, conduct the audit, lessons learned, monitoring
- Define objectives, conduct the audit, review results, perform a secondary audit
- Define scope, define objectives, conduct the audit, monitor results
Correct answer: Define objectives, define scope, conduct the audit, lessons learned
There are four main steps in audit planning as listed below in the correct order:
- Define objectives
- Define scope
- Conduct the audit
- Lessons learned (and analysis)
It is essential that this process is followed carefully, as it is critical to understanding the security status of the business. A controlled, careful approach ensures that we first determine what will be examined and which approach and technologies will be used.
23.
Which of the following is NOT the name of a monitoring service of a major CSP?
- CloudLog
- CloudWatch
- Azure Monitor
- GCP Operations Suite
Correct answer: CloudLog
Cloud service providers often offer their own monitoring services (a brief usage sketch follows the list below). Some of the major ones include:
- AWS: CloudWatch
- Azure: Azure Monitor
- GCP: GCP Operations Suite
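As an illustration of how such a service is used, here is a minimal sketch of publishing a custom metric to AWS CloudWatch with the boto3 library. The namespace and metric name are invented for this example, and configured AWS credentials are assumed.

```python
# Hedged sketch: publishing a custom metric to AWS CloudWatch with boto3.
# Assumes AWS credentials are configured; the namespace and metric name
# below are made up for illustration.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

cloudwatch.put_metric_data(
    Namespace="Example/App",  # hypothetical namespace
    MetricData=[
        {
            "MetricName": "FailedLogins",  # hypothetical metric
            "Value": 3.0,
            "Unit": "Count",
        }
    ],
)
```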
24.
Dion is working with the operation team to deploy security tools within the cloud. They are looking for something that could detect, identify, isolate, and analyze an attack by distracting them. What would you recommend?
- Honeypot
- Intrusion Detection System (IDS)
- Firewall
- Network Security Group (NSG)
Correct answer: Honeypot
A honeypot consists of a computer, data, or a network site that appears to be part of a network but is actually isolated and monitored and that seems to contain information or a resource of value to attackers.
What makes a honeypot a better answer than an IDS is the final part of the question: "by distracting them." An IDS could detect and identify the attack, but it would not distract the bad actor, who would never know it was there; it is a tool that only monitors traffic.
A firewall might distract the bad actor, but not in the sense the question indicates. The bad actor might take some time to explore the firewall, but a firewall is a real device, and it is not advisable to add one with the intention of distracting the bad actor unless it is part of a honeypot.
An NSG is effectively a firewalled group of resources in the cloud, so the statement above about firewalls applies equally to NSGs.
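To make the concept concrete, the following is a minimal, hypothetical sketch of a low-interaction honeypot: a decoy listener that logs connection attempts and presents a fake service banner to hold the attacker's attention. The port, banner, and log file are arbitrary choices, and real honeypot products are considerably more sophisticated (emulated services, isolation, alerting).

```python
# Minimal conceptual sketch of a low-interaction honeypot: a listener on an
# otherwise unused port that logs every connection attempt for analysis.
import logging
import socket

logging.basicConfig(filename="honeypot.log", level=logging.INFO,
                    format="%(asctime)s %(message)s")

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
    server.bind(("0.0.0.0", 2222))  # arbitrary decoy port
    server.listen()
    while True:
        conn, addr = server.accept()
        logging.info("Connection attempt from %s:%s", addr[0], addr[1])
        conn.sendall(b"SSH-2.0-OpenSSH_8.9\r\n")  # fake banner to hold attention
        conn.close()
```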
25.
Which of the following is NOT one of the critical elements of a management plane?
- Monitoring
- Scheduling
- Orchestration
- Maintenance
Correct answer: Monitoring
According to the CCSP, the three critical elements of a management plane are scheduling, orchestration, and maintenance.
Monitoring is not an element of the management plane.
26.
Which of the following types of testing verifies that a module fits properly into the system as a whole?
- Integration Testing
- Unit Testing
- Usability Testing
- Regression Testing
Correct answer: Integration Testing
Functional testing is used to verify that software meets the requirements defined in the first phase of the SDLC. Examples of functional testing include:
- Unit Testing: Unit tests verify that a single component (function, module, etc.) of the software works as intended.
- Integration Testing: Integration testing verifies that the individual components of the software fit together correctly and that their interfaces work as designed.
- Usability Testing: Usability testing verifies that the software meets users’ needs and provides a good user experience.
- Regression Testing: Regression testing is performed after changes are made to the software and verifies that the changes haven’t introduced bugs or broken functionality.
Non-functional testing tests the quality of the software and verifies that it provides necessary functionality not explicitly listed in requirements. Load and stress testing or verifying that sensitive data is properly secured and encrypted are examples of non-functional testing.
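To make the distinction between the first two categories concrete, here is a minimal pytest-style sketch. The pricing functions are invented for illustration.

```python
# Hedged sketch of a unit test vs. an integration test (pytest-style).
# The catalog and cart functions are invented for this example.

def item_price(item: str) -> float:
    prices = {"apple": 0.50, "bread": 2.00}
    return prices[item]

def cart_total(items: list[str]) -> float:
    return sum(item_price(i) for i in items)

def test_item_price_unit():
    # Unit test: verifies a single component in isolation.
    assert item_price("apple") == 0.50

def test_cart_total_integration():
    # Integration test: verifies cart_total and item_price work together.
    assert cart_total(["apple", "bread"]) == 2.50
```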
27.
Estella, the information security manager, is working with senior management to prepare and plan for a new data center. As a cloud provider, they know it is critical to ensure that their customers' data is protected as needed. One of the key industries they serve is the health care industry, and Estella understands that there are specific laws that govern the protection of Protected Health Information (PHI), which includes X-rays, blood tests, drug prescriptions, and so on.
What is the primary physical consideration that must be determined FIRST when building a data center?
- Location
- Budget
- Natural disasters
- Redundancy
Correct answer: Location
Location is the major and primary concern when building a data center. It's important to understand the jurisdiction where the data center will be located. This means understanding the local laws and regulations under that jurisdiction. In the USA, one of the relevant laws is HIPAA, which does have requirements for where data is stored geographically. Additionally, the physical location of the data center will also drive requirements for protecting data during threats such as natural disasters.
Natural disasters are something to consider, but location covers both natural disasters and laws.
Redundancy is important, but it will be designed and built in as the company progresses with this plan.
Budget is important, but the location, laws, regulations, and natural disasters would most likely be the first concern, and all of these are encompassed by the answer: location.
28.
Which of the following cloud deployment models is NOT defined in NIST SP 800-145?
- Multi-cloud
- Public cloud
- Hybrid cloud
- Private cloud
Correct answer: Multi-cloud
NIST SP 800-145 defines four cloud deployment models. They are:
- Private cloud: In private clouds, the cloud customer builds their own cloud in-house or has a provider do so for them. Private clouds have dedicated servers, making them more secure but also more expensive.
- Public cloud: Public clouds are multi-tenant environments where multiple cloud customers share the same infrastructure managed by a third-party provider.
- Hybrid cloud: Hybrid cloud deployments mix both public and private cloud infrastructure. This allows data and applications to be hosted on the cloud that makes the most sense for them.
- Community cloud: A community cloud is essentially a private cloud used by a group of related organizations rather than a single organization. It could be operated by that group or a third party, such as FedRAMP-compliant cloud environments operated by cloud service providers.
Multi-cloud environments use cloud services from multiple different cloud providers. They enable customers to take advantage of price differences or optimizations offered by different providers. While multi-cloud is growing in popularity, it is not currently defined in NIST SP 800-145.
29.
An organization is looking to balance concerns about data security with the desire to leverage the scalability and cost savings of the cloud. Which of the following cloud models is the BEST choice for this?
- Hybrid Cloud
- Private Cloud
- Community Cloud
- Public Cloud
Correct answer: Hybrid Cloud
Cloud services are available under a few different deployment models, including:
- Private Cloud: In private clouds, the cloud customer builds their own cloud in-house or has a provider do so for them. Private clouds have dedicated servers, making them more secure but also more expensive.
- Public Cloud: Public clouds are multi-tenant environments where multiple cloud customers share the same infrastructure managed by a third-party provider.
- Hybrid Cloud: Hybrid cloud deployments mix both public and private cloud infrastructure. This allows data and applications to be hosted on the cloud that makes the most sense for them. For example, sensitive data can be stored on the private cloud, while less-sensitive applications can take advantage of the benefits of the public cloud.
- Multi-Cloud: Multi-cloud environments use cloud services from multiple different cloud providers. This enables customers to take advantage of price differences or optimizations offered by different providers.
- Community Cloud: A community cloud is essentially a private cloud used by a group of related organizations rather than a single organization. It could be operated by that group or a third party, such as FedRAMP-compliant cloud environments operated by cloud service providers.
30.
Padma has deployed a switch technology that differs from what her organization has used for a very long time. The new technology removes the decision-making process from the switch and moves it to a controller, leaving the switch with only the task of forwarding frames.
What technology has been deployed?
- Software Defined Networking (SDN)
- Virtual Local Area Network (VLAN)
- Fibre Channel
- internet Small Computer System Interface (iSCSI)
Correct answer: Software Defined Networking (SDN)
Within a Software Defined Network (SDN), decisions about where traffic is filtered or sent are completely separated from the actual forwarding of traffic.
A Virtual Local Area Network (VLAN) is used to expand a local area network beyond physical/geographical limitations. It does not remove decision-making from the switch.
Fibre Channel (FC) and internet Small Computer System Interface (iSCSI) are technologies used in Storage Area Networks (SANs) so that devices can communicate with the connected switch using a protocol more efficient than Ethernet (IEEE 802.3).
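The following toy sketch (not a real SDN protocol such as OpenFlow) illustrates the separation: the controller object makes the forwarding decision, while the switch object only consults its flow table and forwards. The forwarding policy is invented for illustration.

```python
# Toy illustration of separating the control plane from the data plane:
# the controller decides, the switch only forwards.

class Controller:
    """Control plane: makes the forwarding decisions."""
    def decide(self, dst_mac: str) -> int:
        # Invented policy: MACs ending in an even hex digit go to port 1.
        return 1 if int(dst_mac[-1], 16) % 2 == 0 else 2

class Switch:
    """Data plane: forwards frames using rules pushed by the controller."""
    def __init__(self, controller: Controller):
        self.controller = controller
        self.flow_table: dict[str, int] = {}

    def forward(self, dst_mac: str) -> int:
        if dst_mac not in self.flow_table:  # table miss: ask the controller
            self.flow_table[dst_mac] = self.controller.decide(dst_mac)
        return self.flow_table[dst_mac]     # then forward per the table

switch = Switch(Controller())
print(switch.forward("aa:bb:cc:dd:ee:02"))  # 1
```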
31.
If either Structured Query Language (SQL) injection or cross-site scripting vulnerabilities exist within any Software as a Service (SaaS) implementation, customers' data is at risk. Of the following, what is the BEST method for preventing this type of security risk?
- Input validation
- Bounds checking
- Output validation
- Data sanitization
Correct answer: Input validation
Cross-Site Scripting (XSS) occurs on webpages. SQL injection can occur on a webpage or in any user-filled form backed by a SQL database. Both can be prevented with input validation: SQL commands are very recognizable, and software can be coded to detect and block user inputs that contain them. XSS is likewise detectable within the HTML of a webpage; if a user is being redirected to a different domain, the redirect can be blocked, or the user can at least be notified that they are being directed to another site.
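The following is a minimal sketch of both defenses in Python, using the standard library's sqlite3 and html modules; the table and inputs are invented for illustration.

```python
# Minimal sketch of input handling that blunts SQL injection and XSS.
# The table name and user inputs are invented for this example.
import html
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")

user_input = "Robert'); DROP TABLE users;--"

# SQL injection defense: a parameterized query treats input as data, not SQL.
conn.execute("INSERT INTO users (name) VALUES (?)", (user_input,))

# XSS defense: escape user input before reflecting it into HTML output.
safe_output = html.escape("<script>alert('xss')</script>")
print(safe_output)  # &lt;script&gt;alert(&#x27;xss&#x27;)&lt;/script&gt;
```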
Bounds checking is a technique used in computer programming to ensure that an index or pointer accessing an array or data structure remains within the valid range of the data it is accessing. It is primarily used to prevent buffer overflows, array out-of-bounds errors, and other related vulnerabilities that can lead to security vulnerabilities or program crashes.
Output validation, also known as output verification or output validation testing, is a process in software development that involves verifying and validating the correctness, integrity, and quality of the output produced by a system, application, or module.
Data sanitization is the process of removing data from the media in some manner, such as overwrites or physical destruction.
32.
Which of the following data security methods requires secure random number generation?
- Encryption
- Hashing
- Anonymization
- Masking
Correct answer: Encryption
Cloud customers can use various strategies to protect sensitive data against unauthorized access, including:
- Encryption: Encryption performs a reversible transformation on data that renders it unreadable without knowledge of the decryption key. If data is encrypted with a secure algorithm, the primary security concerns are generating random encryption keys and protecting them against unauthorized access (a brief key-generation sketch follows this list). FIPS 140-3 is a US government standard used to evaluate cryptographic modules.
- Hashing: Hashing is a one-way function used to ensure the integrity of data. Hashing the same input will always produce the same output, but it is infeasible to derive the input to the hash function from the corresponding output. Applications of hash functions include file integrity monitoring and digital signatures. FIPS 180-4 is the US government standard for hash functions.
- Masking: Masking involves replacing sensitive data with non-sensitive characters. A common example of this is using asterisks to mask a password on a computer or all but the last four digits of a credit card number.
- Anonymization: Anonymization and de-identification involve destroying or replacing all parts of a record that can be used to uniquely identify an individual. While many regulations require anonymization for data use outside of certain contexts, it is very difficult to fully anonymize data.
- Tokenization: Tokenization replaces sensitive data with a non-sensitive token on untrusted systems that don’t require access to the original data. A table mapping tokens to the data is stored in a secure location to enable the original data to be looked up when needed.
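As a small illustration of the key-generation concern noted under encryption, the sketch below draws key material from a cryptographically secure source. It assumes the third-party cryptography package is installed; the plaintext is invented.

```python
# Sketch of secure random key generation and use. Assumes the third-party
# "cryptography" package (pip install cryptography); plaintext is invented.
import secrets
from cryptography.fernet import Fernet

raw_key = secrets.token_bytes(32)  # 256 bits from the OS CSPRNG, never random.random()

key = Fernet.generate_key()        # Fernet also uses a secure RNG internally
f = Fernet(key)
token = f.encrypt(b"card ending 1111")  # reversible with the key...
print(f.decrypt(token))                 # ...unreadable without it
```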
33.
Hao is responsible for vendor management at a large bank that relies on several vendors for different services at different times.
The third-party services include a public cloud provider for their Infrastructure and Platform as a Service (IaaS & PaaS) deployments. Hao has realized that vendor management is becoming complex and creating bottlenecks in the organization.
What international standard should Hao reference for handling supplier relationships?
- ISO/IEC 27036
- NIST SP 800-88
- NIST SP 800-145
- ISO/IEC 17788
Correct answer: ISO/IEC 27036
ISO/IEC 27036 is a set of international standards that provides guidance on information security for supplier relationships. It focuses on establishing and maintaining secure relationships between organizations and their suppliers, ensuring the protection of information assets throughout the supply chain. It may not remove all of the complexity Hao faces, but it provides a structured framework for managing supplier relationships.
ISO/IEC 17788, also known as ISO 17788:2014, is an international standard that provides guidelines and definitions for cloud computing. It aims to establish a common understanding of cloud computing concepts, terminology, and models, facilitating communication and interoperability among different stakeholders involved in cloud-related activities.
NIST SP 800-145 defines cloud computing.
NIST SP 800-88 covers categories of media sanitization.
34.
Which of the following emerging technologies improves portability in the cloud?
- Containers
- Fog Computing
- TEEs
- Edge Computing
Correct answer: Containers
Cloud computing is closely related to many emerging technologies. Some examples include:
- Containers: Containerization packages an application with all of the dependencies that it needs to run in a single package. This container can then be moved to any platform running the container software, including cloud platforms (a brief sketch follows this list).
- Edge and Fog Computing: Edge and fog computing move computations from centralized servers to devices at the network edge, enabling faster responses and less usage of bandwidth and computational power by cloud servers. Edge computing performs computing on IoT devices, while fog computing uses gateways at the edge to collect data from these devices and perform computation there.
- Confidential Computing: While data is commonly encrypted at rest and in transit, it is often decrypted while in use, which creates security concerns. Confidential computing involves the use of trusted execution environments (TEEs) that protect and isolate sensitive data from potential threats while in use.
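As a small illustration of container portability, the sketch below uses the Docker SDK for Python to run a container image that would run unchanged on any host with the container runtime installed. It assumes the docker package (pip install docker) and a running Docker daemon; the image and command are arbitrary examples.

```python
# Hedged sketch: running a container with the Docker SDK for Python.
# Assumes "pip install docker" and a running Docker daemon.
import docker

client = docker.from_env()

# The same image runs unchanged on a laptop, a server, or a cloud VM,
# because the container carries its own dependencies.
output = client.containers.run("alpine:latest", "echo hello from a container")
print(output.decode())
```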
35.
A distributed resource scheduler is a coordination element for which of the following platforms?
- VMware ESXi
- Kubernetes
- Virtual Machine Manager
- Hyper-V
Correct answer: VMware ESXi
A distributed resource scheduler is a coordination element for VMware ESXi. It facilitates access to resources and supports high availability.
There is no specific feature called distributed resource scheduler for Kubernetes, Hyper-V, or Virtual Machine Manager.
36.
Ariel, a server administrator at Acme Inc., wants to create a point-in-time backup of an entire virtual machine running on a type I hypervisor. The virtual machine runs an SQL database server.
Which of the following is the BEST choice to create a point-in-time backup in this case?
- Snapshot
- pg_dump
- Database dump
- iSCSI
Correct answer: Snapshot
In most virtualization environments, VM snapshots are performed at the hypervisor level without requiring specific software or agents installed within the VM. Snapshots create a point-in-time backup of the VM.
A database dump would create a database backup, not a point-in-time backup for the entire virtual machine. pg_dump is a PostgreSQL command for creating a database dump.
The internet Small Computer System Interface (iSCSI) is a network protocol for connecting to storage systems.
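Since snapshot APIs vary by hypervisor, here is an analogous hedged example from a public cloud: creating a point-in-time EBS volume snapshot with boto3. The volume ID is a placeholder, and for a database server the database should typically be quiesced or flushed first so the snapshot is application-consistent.

```python
# Hedged sketch: a point-in-time snapshot via a cloud API (AWS EBS + boto3).
# The volume ID below is a placeholder; AWS credentials are assumed.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.create_snapshot(
    VolumeId="vol-0123456789abcdef0",  # placeholder volume ID
    Description="Point-in-time backup before maintenance",
)
print(response["SnapshotId"])
```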
37.
Which of the following main goals of IRM is MOST concerned with HOW a user accesses a resource?
- Access Models
- Data Rights
- Provisioning
- Enforcement
Correct answer: Access Models
Information rights management (IRM) involves controlling access to data, including implementing access controls and managing what users can do with the data. The three main objectives of IRM are:
- Data Rights: Data rights define what users are permitted to do with data (read, write, execute, forward, etc.). It also deals with how those rights are defined, applied, changed, and revoked.
- Provisioning: Provisioning is when users are onboarded to a system and rights are assigned to them. Often, this uses roles and groups to improve the consistency and scalability of rights management, as rights can be defined granularly for a particular role or group and then applied to everyone that fits in that group.
- Access Models: Access models take the means by which data is accessed into account when defining rights. For example, data presented via a web application has different potential rights (read, copy-paste, etc.) than data provided in files (read, write, execute, delete, etc.).
Enforcement is not a main objective of IRM.
38.
An organization's communications with which of the following is MOST likely to include information about planned and unplanned outages and other information designed to protect the brand image?
- Customers
- Partners
- Vendors
- Regulators
Correct answer: Customers
An organization may need to communicate with various parties as part of its security and risk management process. These include:
- Vendors: Companies rely on vendor-provided solutions, and a vendor experiencing problems could result in availability issues or potential vulnerabilities for their customers. Relationships with vendors should be managed via contracts and SLAs, and companies should have clear lines of communication to ensure that customers have advance notice of potential issues and that they can communicate any observed issues to the vendor.
- Customers: Communications between a company and its customers are important to set SLA terms, notify customers of planned and unplanned service interruptions, and otherwise handle logistics and protect the brand image.
- Partners: Partners often have more access to corporate data and systems than vendors but are independent organizations. Partners should be treated similarly to employees with defined onboarding/offboarding and management processes. Also, the partnership should begin with mutual due diligence and security reviews before granting access to sensitive data or systems.
- Regulators: Regulatory requirements also apply to cloud environments. Organizations receive regulatory requirements and may need to demonstrate compliance or report security incidents to relevant regulators.
Organizations may need to communicate with other stakeholders in specific situations. For example, a security incident or business disruption may require communicating with the public, employees, investors, regulators, and other stakeholders. Organizations may also have other reporting requirements, such as quarterly reports to stakeholders, that could include security-related information.
39.
Carin is working at a real estate company as the information security manager. She was recently hired to begin to build a solid information security program. Up until now, the company has only had a few policies and procedures in place as well as desktop firewalls and a network Intrusion Detection System (IDS). She knows there is a lot of work to do to build a secure environment for the users, especially since they handle a lot of sensitive customer personal information. Today she is looking at how a data leak could occur within this business.
If they determine that the data is most likely to be leaked through their website when the bad actor is able to compromise a stored link that redirects the user to the bad actor's site where they enter and share their credentials with the bad actor, what phase of the data lifecycle would this be?
- Use
- Destroy
- Store
- Archive
Correct answer: Use
Since the user is logging in through the bad actor's site, this would be the use phase. The user is logging in to view the data. It is not being modified, nor is it being shared with someone else.
The data is stored on the website, or behind the website, but that is not what the user is doing. The user is accessing it now, so that is use.
Archival is when the data is intentionally moved into a long-term storage location. The data is not being moved in this question, only viewed.
Similarly, the data is not being destroyed. The bad actor may destroy it when they log in with the stolen credentials, but that is not the concern at the moment. That is in the future.
40.
Charlie is working with the developers as they build a new piece of software that will be able to store and retrieve data in the cloud. How does a piece of software access object, file, block, and database storage?
- Application Programming Interface (API)
- Transport Layer Security (TLS)
- Security Assertion Markup Language (SAML)
- Internet Protocol Security (IPSec)
Correct answer: Application Programming Interface (API)
Multiple types of cloud storage technologies use APIs to access data (a short object storage sketch follows the list below). Some common examples include:
- Object Storage: Object storage systems like Amazon S3, Microsoft Azure Blob Storage, and Google Cloud Storage
- File Storage: Cloud file storage services, such as Amazon Elastic File System (EFS), Azure Files, and Google Cloud Filestore
- Block Storage: Cloud block storage services like Amazon Elastic Block Store (EBS), Azure Disk Storage, and Google Cloud Persistent Disk
- Database Storage: Cloud database services, such as Amazon Relational Database Service (RDS), Azure SQL Database, and Google Cloud SQL
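As a brief sketch of the first storage type, the code below stores and retrieves an object through the Amazon S3 API with boto3. The bucket and key names are invented, and configured AWS credentials and an existing bucket are assumed.

```python
# Minimal sketch of API-based object storage access with boto3 (Amazon S3).
# Bucket and key names are invented; AWS credentials are assumed.
import boto3

s3 = boto3.client("s3")

# Store an object via the API...
s3.put_object(Bucket="example-bucket", Key="reports/q1.txt", Body=b"hello")

# ...and retrieve it the same way.
obj = s3.get_object(Bucket="example-bucket", Key="reports/q1.txt")
print(obj["Body"].read())  # b'hello'
```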
TLS is used to encrypt data in transit. TLS can and should be used to encrypt RESTful API traffic, but it is not the access method used to actually find and retrieve a piece of data.
SAML can be used to authenticate the user before they are allowed to access the storage, but it too does not actually find and retrieve a piece of data.
Internet Protocol Security (IPSec) could be used to secure a Virtual Private Network (VPN) connection by encrypting the traffic, or to connect the router at the office to the edge router in the cloud. Either way, like TLS, it encrypts the data rather than finding and retrieving it.