Trae Johnson

Forum Replies Created
    in reply to: OCU C)HT B Discussion Lesson 04 #98608
    Trae Johnson
    Participant

    Excellent explanation! One point worth adding: although ROM is conventionally regarded as unalterable, technologies such as EEPROM and flash now allow limited reprogramming. There are also different kinds of RAM, such as DRAM and SRAM, which serve somewhat different purposes depending on what the system requires.

    in reply to: OCU C)HT B Discussion Lesson 04 #98607
    Trae Johnson
    Participant

    One thing I’d add is that ROM is often used in embedded systems, where the device runs specific tasks without the need for frequent updates, making its non-volatile nature perfect for such purposes. RAM, being volatile, is more useful for multitasking in modern computing, where quick access to data is crucial for smooth performance.

    in reply to: OCU C)HT B Discussion Lesson 04 #98605
    Trae Johnson
    Participant

    The main difference between ROM and RAM is volatility. ROM is non-volatile memory that retains its data when the computer is turned off; it stores essential instructions, such as firmware or a bootloader, used to initialize and control the computer. It is usually written once during manufacturing and is difficult to change afterward.

    On the other hand, RAM is volatile memory that loses its stored data whenever the system is powered down. It temporarily holds the data the processor uses while running applications and operating system processes. Data stored in RAM can be read and written very quickly, which makes RAM crucial to system performance.

    Whereas ROM is mainly used to store fixed system instructions that do not change, RAM is employed for the temporary storage of data that may change while the system is running.
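
    As a loose analogy (a toy Python sketch with a made-up file name, not real firmware access), data held only in a process's memory vanishes when the process ends, while data written to a file persists across runs, much as RAM contents vanish at power-off while ROM contents survive:

    import os

    ram = {"scratch": "lost on shutdown"}        # volatile: lives only in this process

    FIRMWARE = "firmware.txt"                    # non-volatile: survives "power cycles"
    if not os.path.exists(FIRMWARE):
        with open(FIRMWARE, "w") as f:
            f.write("boot instructions\n")       # "written once at manufacture"

    with open(FIRMWARE) as f:
        print("From 'ROM':", f.read().strip())   # still there on every run
    print("From 'RAM':", ram.get("scratch"))     # recreated from scratch each run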

    References
    Stallings, W. (2018). Computer organization and architecture: Designing for performance (11th ed.). Pearson.

    in reply to: OCU C)HT B Devotion 02 #98591
    Trae Johnson
    Participant

    Yes, Satan is always after us all and will always be lurking for the chance to strike. No, I don't feel afraid, as the Lord is always with me. I know there is nothing to fear when I stand with God. I accept His son Jesus as my savior; I may not be a perfect man, nor sinless, but I know I will not lose to Satan or his pitiful minions.

    in reply to: OCU C)HT B Discussion Lesson 02 #98495
    Trae Johnson
    Participant

    The motherboard is the main circuit board that connects the various parts of a computer and allows them to interface with each other. It houses key components such as the CPU, memory, storage devices, and other peripherals, linked by buses and pathways. It also contains the chipset, which helps manage the flow of data between the CPU, memory, and peripheral devices. In addition, it provides various input/output ports and expansion slots to accommodate added components, such as graphics cards or additional storage.

    The motherboard also plays other major roles, such as supplying power to the system's components: it distributes electrical power from the PSU to the key components responsible for keeping the system working properly. Motherboards also host the system's BIOS/UEFI, a set of firmware interfaces that initialize hardware at boot and provide runtime services for operating systems and programs (Soper, 2020).

    Generally, the CPU is considered the "brain" of the computer because it lies at the very heart of executing instructions and processing data. The central role of a CPU involves performing arithmetic and logic operations and moving data through fetching, decoding, executing, and writing back, a sequence normally referred to as the instruction cycle. Most modern CPUs are multi-core processors: they contain multiple processing units, called cores, which let them execute several tasks at a time and thus increase performance and efficiency.
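
    As a rough illustration of multi-core parallelism, here is a minimal Python sketch (the workload function is invented for the example) that spreads independent tasks across the available cores with a process pool:

    from multiprocessing import Pool, cpu_count

    def busy(n: int) -> int:
        return sum(i * i for i in range(n))  # CPU-bound toy task

    if __name__ == "__main__":
        print(f"Detected {cpu_count()} logical cores")
        with Pool() as pool:                       # one worker per core by default
            results = pool.map(busy, [10**6] * 8)  # tasks run in parallel
        print(len(results), "tasks completed")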

    The CPU uses the buses on the motherboard to communicate with other parts of the system, retrieving data from memory or storage to process. In almost every respect it directs the rest of the system, carrying out decisions based on the data fed into it and keeping operations running efficiently and smoothly. The rate at which this cycle runs is set by the CPU's clock speed, usually measured in gigahertz.
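
    To make the instruction cycle concrete, here is a toy Python sketch of fetch, decode, execute, and write back; the three-field instruction format is invented for illustration and is not any real machine's instruction set:

    # Toy program: load two values, add them, halt.
    memory = [("LOAD", "A", 5), ("LOAD", "B", 7), ("ADD", "A", "B"), ("HALT", None, None)]
    registers = {"A": 0, "B": 0}
    pc = 0  # program counter

    while True:
        op, dst, src = memory[pc]          # fetch
        pc += 1
        if op == "HALT":                   # decode
            break
        elif op == "LOAD":
            registers[dst] = src           # execute + write back
        elif op == "ADD":
            registers[dst] = registers[dst] + registers[src]

    print(registers)  # {'A': 12, 'B': 7}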

    References
    Lowe, J. (2021). The anatomy of a computer: Understanding the motherboard and CPU. TechWorld Publishing.

    Soper, D. (2020). Motherboards: What they do and why they matter. Computer Hardware Journal, 22(4), 34-39.

    Stokes, B. (2021). Inside the CPU: How processors work. Advanced Computing Insights, 15(3), 12-19.

    in reply to: OCU C)HT B Discussion Lesson 01 #98490
    Trae Johnson
    Participant

    Effective troubleshooting tends to spur continuous learning. It does not stop at symptoms but looks for the reasons underlying them, and so contributes to long-term solutions and process improvement. It is an important skill, especially in technical fields such as IT and engineering, yet it is also relevant to daily problem solving in personal and professional life. Beyond that, troubleshooting sharpens decision-making under uncertainty, since one becomes practiced at analyzing incomplete information to arrive at the best course of action.

    The value of troubleshooting goes beyond solving the immediate problem; it instills a culture of proactive thinking. People and organizations that acquire this skill innovate faster and tackle future challenges more confidently. Troubleshooting therefore acts not only as a reactive tool but also as a proactive mechanism for improvement and development in both technical and non-technical domains.

    in reply to: OCU C)HT B Devotion 01 #98469
    Trae Johnson
    Participant

    I haven't found the final destination as of yet. I do believe that I am on the right path, though, and I am not in such a dark place anymore. I have grown and learned through this door that the Lord has opened for me. I think I will be coming to the end of the path and finding my ultimate peace soon, which I am truly thankful for; it has been a long time coming.

    in reply to: OCU C)OST B Discussion Lesson 11 #98251
    Trae Johnson
    Participant

    SOHO routers are network devices designed to deliver connectivity and management of the network for small-scale environments, such as a home office or small business. They combine several networking features into one device tailored for the needs of smaller networks that don’t need the same level of capacity required by larger enterprises. SOHO routers often come with several key features that optimize network performance, security, and usability.

    Here are some key features of SOHO routers:

    Internet Connectivity:
    SOHO routers connect to an Internet Service Provider (ISP) to provide reliable access to the internet. Most support broadband connections, including DSL, fiber, and cable, and some offer multiple WAN ports to support failover and load balancing, keeping connectivity consistent if one ISP link fails (Gupta & Garg, 2022).

    Firewall and Security Features:
    Another key feature of SOHO routers is an integrated firewall that protects the internal network from intrusion. The firewall blocks unauthorized traffic and can be configured to permit or deny access to specific network services or applications. Most SOHO routers also support VPN capabilities, enabling secure remote access to the office network over the internet.
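
    As a toy illustration of that permit-or-block idea (not any real router's firewall engine), here is a minimal Python sketch where the first matching rule wins and the default is deny; the rules and ports are made-up examples:

    from dataclasses import dataclass

    @dataclass
    class Rule:
        action: str      # "allow" or "deny"
        dst_port: int    # destination port the rule applies to

    RULES = [
        Rule("allow", 443),   # HTTPS
        Rule("allow", 80),    # HTTP
        Rule("deny", 23),     # Telnet explicitly blocked
    ]

    def filter_packet(dst_port: int) -> str:
        for rule in RULES:           # first match wins
            if rule.dst_port == dst_port:
                return rule.action
        return "deny"                # default-deny posture

    for port in (443, 23, 8080):
        print(port, "->", filter_packet(port))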

    Wireless Connectivity:
    Most SOHO routers have integrated wireless access points that give devices on the network Wi-Fi capability. According to Smith (2021), "The Wi-Fi feature in routers enables users to access the Internet wirelessly". The supported standards typically range from Wi-Fi 5 (802.11ac) to Wi-Fi 6 (802.11ax), providing high-speed wireless access and greater device capacity. Advanced routers may offer dual-band or tri-band technology, which helps reduce congestion and improve overall network performance (Gupta & Garg, 2022).

    Network Management:
    SOHO routers often include integrated network management tools that help the user observe and control network traffic. One example is Quality of Service (QoS) settings, which identify traffic types such as VoIP or video conferencing and let them travel ahead of less important traffic, improving the experience of using these key applications (Smith, 2021).
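
    A minimal Python sketch of the prioritization idea behind QoS; the traffic classes and priority values are invented for the example, not taken from any real firmware:

    import heapq
    import itertools

    PRIORITY = {"voip": 0, "video": 1, "web": 2, "bulk": 3}  # lower = dequeued first
    counter = itertools.count()   # tie-breaker keeps FIFO order within a class

    queue = []

    def enqueue(traffic_class, packet):
        heapq.heappush(queue, (PRIORITY[traffic_class], next(counter), packet))

    def dequeue():
        _, _, packet = heapq.heappop(queue)
        return packet

    enqueue("bulk", "backup chunk")
    enqueue("voip", "voice frame")
    enqueue("web", "http request")

    while queue:
        print(dequeue())  # voice frame, http request, backup chunk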

    Port Forwarding and NAT:
    Among the key features of SOHO routers are Network Address Translation (NAT) and port forwarding. NAT allows multiple devices on a local network to share a single public IP address, conserving public IP addresses while keeping the internal network private. Port forwarding lets users direct external traffic to internal devices on the network in order to host services such as web servers or gaming servers.
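
    As a rough sketch of what a port-forwarding rule does, here is a minimal Python TCP relay that accepts connections on an "external" port and pipes the traffic to an internal host. The addresses and ports are made-up examples, and a real router does this in the kernel rather than in user space:

    import socket
    import threading

    LISTEN_PORT = 8080             # "external" port on the router
    TARGET = ("192.168.1.50", 80)  # internal web server (example address)

    def pipe(src, dst):
        """Copy bytes from one socket to the other until EOF or error."""
        try:
            while (data := src.recv(4096)):
                dst.sendall(data)
        except OSError:
            pass                   # connection reset or closed by the other side
        finally:
            src.close()
            dst.close()

    def main():
        server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind(("0.0.0.0", LISTEN_PORT))
        server.listen(5)
        print(f"Forwarding :{LISTEN_PORT} -> {TARGET[0]}:{TARGET[1]}")
        while True:
            client, addr = server.accept()
            upstream = socket.create_connection(TARGET)
            # Relay in both directions on separate threads.
            threading.Thread(target=pipe, args=(client, upstream), daemon=True).start()
            threading.Thread(target=pipe, args=(upstream, client), daemon=True).start()

    if __name__ == "__main__":
        main()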

    Parental Controls and Content Filtering:
    Most SOHO routers offer parental controls and content filtering. These filters block objectionable content, such as specific websites, and limit network access. The feature is especially useful in home offices and small businesses where network use by employees or family members needs to be controlled effectively (Smith, 2021).

    Conclusion
    SOHO routers combine internet connectivity, wireless access, security, and network management services in a single device suited to small-scale environments. By supporting VPN, firewall, Wi-Fi access, and QoS features, they let small businesses and home offices operate efficiently and securely.

    References
    Gupta, R., & Garg, S. (2022). Networking essentials: A practical guide to small office networks. TechPress.

    Smith, A. (2021). Router security for home and small business: Understanding the essentials. SecureNet Publications.

    in reply to: OCU C)OST B Devotion 05 #98250
    Trae Johnson
    Participant

    Fear is a completely normal thing for us to face every day; even in our sleep we feel the presence of fear through nightmares. Fear is always lurking and creeping around every corner, waiting to strike, from the smallest microorganisms to the largest creatures and all living things in between. I personally take my sense of threat and fear seriously; it tells me that something is off and that I should be on alert for anything to happen. I believe the Lord gave us this sense to help mitigate the deceptions of Satan and his demons. We feel a sense of safety and security in the presence of the Lord and the Holy Spirit, yet we must understand that this fear is temporary, as is the evil that lurks around every corner. It can be defused and rejected, but we must accept Jesus as our savior to make that possible. Otherwise we will face the deception of a false safety at the hands of the evil from hell. We shall not allow this control; we recognize Jesus as our savior and accept God as our creator.

    in reply to: OCU C)OST B Discussion Lesson 06 #97957
    Trae Johnson
    Participant

    There are two major approaches to troubleshooting Microsoft Windows: systematic troubleshooting and restorative troubleshooting. Each follows different steps and is used for different reasons; which to use depends on what the problem is and how serious it is.

    Systematic troubleshooting is, in essence, a step-by-step process in which the actual problem is diagnosed by narrowing the list of possible causes through elimination. It is best employed when the problem is not immediately apparent and it must first be determined whether it stems from software, hardware, or network configuration. For example, if a user reports slow system performance, the systematic method would be to check resource usage in Task Manager, run a disk cleanup, update drivers, and scan for malware. This step-by-step approach works well when a symptom has multiple possible causes, letting the technician eliminate the probable ones and arrive at the exact problem.
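
    A minimal Python sketch of that systematic first pass, using the third-party psutil package to sample CPU, memory, and disk pressure; the 90% thresholds are arbitrary example values:

    import os
    import psutil  # pip install psutil

    def check_resources():
        cpu = psutil.cpu_percent(interval=1)      # sample CPU over one second
        mem = psutil.virtual_memory().percent
        disk = psutil.disk_usage("C:\\" if os.name == "nt" else "/").percent

        print(f"CPU: {cpu}%  Memory: {mem}%  Disk: {disk}%")
        if cpu > 90:
            print("Suspect a runaway process; inspect per-process CPU next.")
        elif mem > 90:
            print("Suspect memory pressure; look for leaks or add RAM.")
        elif disk > 90:
            print("Disk nearly full; run cleanup before deeper diagnostics.")
        else:
            print("Core resources look healthy; move on to drivers and malware scans.")

    if __name__ == "__main__":
        check_resources()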

    Restorative troubleshooting uses the utilities Windows provides to return the system to an earlier point in time when it was known to be working. This method is best when the system has recently been updated or modified and the problems began after those changes. Features such as System Restore, Startup Repair, and Reset This PC are good options in such cases. For instance, if a user experiences frequent crashes after a recent driver update, using System Restore to roll back to a state before the update can resolve the problem quickly and remove the need for further investigation.

    Determining which of these methods to use depends on the particular circumstances surrounding an issue. Systematic troubleshooting is indicated for non-critical issues where the exact problem must be identified; it is ideal when the cause is unclear or more than one variable may be at work. Conversely, restorative troubleshooting is best used when critical issues have a clear origin, for instance after system updates or fresh software installation. This technique takes less time than detailed diagnostics when the sole objective is to get the system up and running again.

    References
    How-To Geek. (2023). How to use Windows 10's system restore (and what it does). https://www.howtogeek.com/222979/how-to-use-windows-10s-system-restore-and-what-it-does/

    Microsoft. (n.d.). Troubleshoot performance issues in Windows. https://support.microsoft.com/en-us/windows/troubleshoot-performance-issues-in-windows

    in reply to: OCU C)OST B Discussion Lesson 07 #98022
    Trae Johnson
    Participant

    Virtualization technologies are an integral part of contemporary IT infrastructure, where efficient use of resources, flexible deployment, and scalable operations are needed. Two significant types dominate today: hardware virtualization and containerization. Each has its own advantages and disadvantages, which give the two different areas of application.

    Hardware Virtualization
    Hardware virtualization, commonly called full virtualization, is the creation and management of virtual machines (VMs) by a hypervisor. Hypervisors such as VMware ESXi, Microsoft Hyper-V, and Oracle VM VirtualBox can run several operating systems on one physical machine simultaneously. This flexibility makes hardware virtualization widely used in data centers and cloud environments that need to run multiple isolated environments.
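
    As a small illustration, assuming VirtualBox is installed, the following Python sketch shells out to the hypervisor's VBoxManage command-line tool to list the VMs it knows about:

    import subprocess

    def vbox_list(what: str) -> str:
        """Run 'VBoxManage list <what>' and return its output."""
        out = subprocess.run(
            ["VBoxManage", "list", what], capture_output=True, text=True, check=True
        )
        return out.stdout.strip()

    print("All VMs:\n", vbox_list("vms"))
    print("Running VMs:\n", vbox_list("runningvms"))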

    One of the major advantages of hardware virtualization is isolation. Because each VM runs separately with its own operating system, applications, and resources, the security boundary between environments is very strong, making hardware virtualization ideal where environments must be kept securely apart. Support for multiple operating systems also makes it versatile across different application, development, and test environments. Hardware virtualization scales by adding more VMs to a host machine, so organizations can grow their infrastructure without extra investment in hardware (Sharma et al., 2022).

    However, hardware virtualization has serious downsides, too. Virtual machines are resource-intensive, requiring significant CPU, memory, and storage, and this overhead reduces performance compared with running an application directly on physical hardware. The hypervisor's emulation of hardware components and its resource management add further performance overhead, which can be a problem for high-performance computing applications (Sharma et al., 2022; Smith & Nair, 2021). At scale, large numbers of virtual machines also complicate orchestration, patch management, and monitoring, all of which must keep up in terms of both efficiency and security (Sharma et al., 2022).

    Containerization
    The other leading virtualization technology is containerization, which operates at the operating-system level rather than the hardware level. Unlike hardware virtualization, which emulates physical hardware, containerization runs applications in isolated user spaces (containers) that share the same operating system kernel. Technologies such as Docker and Kubernetes have made containerization a standard way to develop, deploy, and manage applications.

    Containerization's biggest asset is its lightweight nature. Because containers share the host operating system's kernel, they consume fewer resources and start much faster than virtual machines (Merkel, 2014). This efficiency matters especially in cloud-native environments that demand fast scaling and rapid deployment. Another advantage is consistency across environments: packaging an application together with its dependencies reduces incompatibility problems and keeps the application behaving the same from development through production, which in turn makes CI/CD pipelines easier to apply and development more agile. Containers are also scalable and portable, so applications can move between diverse environments and cloud platforms with few changes (Pahl, 2015).
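
    A minimal sketch of that workflow, assuming a local Docker daemon and the docker SDK for Python (pip install docker): it starts an nginx container with container port 80 published on host port 8080, then cleans up. The same image runs identically wherever a daemon is available, which is the portability point made above.

    import docker

    client = docker.from_env()
    container = client.containers.run(
        "nginx:alpine", detach=True, ports={"80/tcp": 8080}
    )
    print("Started container:", container.short_id)

    # ... the app is now reachable on host port 8080 ...

    container.stop()
    container.remove()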

    For all these benefits, containerization brings challenges of its own. Because containers share the host OS kernel, a vulnerability in that kernel can affect every running container, which raises security risks. Containers also offer less isolation than virtual machines, since each container does not run its own operating system; this lower level of isolation makes them less suitable for certain use cases, especially where sensitive data is handled. Networking in containerized environments can likewise be difficult, particularly in hybrid or multi-cloud deployments where different network configurations and policies must be orchestrated effectively (Pahl, 2015).

    Conclusion
    Hardware virtualization and containerization are the two key virtualization technologies of modern IT infrastructure, each with strengths and relative weaknesses. Hardware virtualization fits where strong isolation is required and multiple operating systems must run on one physical device, at a higher cost in resources, management, and overhead. Containerization, on the other hand, makes applications scalable, portable, and efficient, but its shared-kernel model introduces security risks. Choosing between them is a critical decision for any organization and deserves careful consideration of the specific use cases, security requirements, and resource constraints involved.

    References
    Merkel, D. (2014). Docker: Lightweight Linux containers for consistent development and deployment. Linux Journal, 2014(239), 2.

    Pahl, C. (2015). Containerization and the PaaS cloud. IEEE Cloud Computing, 2(3), 24-31. https://doi.org/10.1109/MCC.2015.51

    Sharma, A., Sood, M., & Sabharwal, A. (2022). Virtualization technologies: An in-depth analysis. International Journal of Computer Applications, 184(14), 22-28.

    Smith, J., & Nair, S. (2021). The new face of virtualization and its impact on cloud computing. Journal of Cloud Computing, 10(1), 45-60. https://doi.org/10.1186/s13677-021-00235-1

    in reply to: OCU C)OST B Discussion Lesson 08 #98069
    Trae Johnson
    Participant

    A network is prone to various kinds of vulnerabilities that may result in unauthorized access, data breaches, or disruption of service. Identifying these vulnerabilities is important for applying the proper security measures. Three common types of network vulnerabilities are:

    Weak Authentication Protocols: Many network breaches involve weak authentication methods, with default and weak passwords the most common culprits. Without strong authentication protocols, it is easy for an attacker to gain access to the network (Tian et al., 2020).

    Unpatched Software: Software that is not updated or patched leaves exploitable vulnerabilities in systems. Hackers seek out unpatched bugs to inject malware or take control of networking devices (Singh & Kumar, 2021).

    Social Engineering Attacks: These attacks exploit human error rather than technical vulnerabilities. One of the most common forms of social engineering is phishing, a method of deceiving users into giving attackers sensitive information or even downloading malware.

    Several countermeasures help address these vulnerabilities:

    Multi-Factor Authentication: MFA strengthens authentication by requiring an additional means of verification beyond the password, such as a fingerprint. This helps mitigate weak authentication (Kumar & Shyamasundar, 2018).
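
    For a sense of how one common second factor works, here is a minimal Python sketch of TOTP (RFC 6238), the time-based one-time-password algorithm behind many authenticator apps; the base32 secret below is a made-up example:

    import base64, hashlib, hmac, struct, time

    def totp(secret_b32: str, step: int = 30, digits: int = 6) -> str:
        key = base64.b32decode(secret_b32, casefold=True)
        counter = int(time.time()) // step              # 30-second time step
        msg = struct.pack(">Q", counter)
        digest = hmac.new(key, msg, hashlib.sha1).digest()
        offset = digest[-1] & 0x0F                      # dynamic truncation
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

    print(totp("JBSWY3DPEHPK3PXP"))  # example secret; code changes every 30 s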

    Patching and Frequent Updating: Keeping software up to date closes the security gaps and vulnerabilities attackers take advantage of. This applies to operating systems, firmware, and applications alike (Singh & Kumar, 2021).
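
    As a small taste of automating update checks, this Python sketch uses pip's real --outdated JSON output as a stand-in for a broader OS or firmware patch audit:

    import json
    import subprocess
    import sys

    result = subprocess.run(
        [sys.executable, "-m", "pip", "list", "--outdated", "--format=json"],
        capture_output=True, text=True, check=True,
    )
    for pkg in json.loads(result.stdout):
        print(f"{pkg['name']}: {pkg['version']} -> {pkg['latest_version']}")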

    User Education and Awareness Training: Regular training sessions that teach employees about the risks of phishing and other social engineering tactics are crucial for reducing attacks that rely on human error (Hadnagy & Fincher, 2020).

    References
    Hadnagy, C., & Fincher, M. (2020). Human hacking: Win friends, influence people, and leave them better off for having met you. Harper Business.

    Kumar, A., & Shyamasundar, R. (2018). Multi-factor authentication to enhance cloud-based system security. IEEE Transactions on Cloud Computing, 6(3), 795-809. https://doi.org/10.1109/TCC.2017.2769643

    Singh, V., & Kumar, P. (2021). Vulnerability management in network security: A comprehensive review. Journal of Information Security and Applications, 58, 102731. https://doi.org/10.1016/j.jisa.2021.102731

    Tian, X., Wang, J., & Wang, W. (2020). Password authentication vulnerabilities and countermeasures. Computer Networks, 175, 107310. https://doi.org/10.1016/j.comnet.2020.107310

    in reply to: OCU C)OST B Discussion Lesson 09 #98073
    Trae Johnson
    Participant

    Physical security is an essential layer of business network protection because it prevents unauthorized access to equipment, data, and other critical infrastructure. Three major types of physical security hardware for business networks are biometric access control systems, surveillance cameras, and security cages.

    Biometric Access Control Systems: These devices allow premises access only to people to whom it has been granted, using fingerprint scanning, iris scanning, or face recognition. For example, a server room may require a fingerprint scan to keep unauthorized individuals out. Such systems reduce the risk of stolen credentials, a common problem with traditional password-based access control. Biometric technology provides a high level of security by authenticating unique physical attributes that are not easily replicated.

    Surveillance Cameras: CCTV plays a critical role in monitoring physical spaces, recording activity, and deterring potential intruders. Surveillance cameras let an organization watch restricted areas in real time or review footage after a security breach, and they are key to identifying individuals and activities that may compromise network infrastructure.

    Physical Barriers: Security cages are enclosures that physically protect critical network hardware, such as servers, routers, and switches, from unauthorized access. They block direct access to sensitive equipment, preventing tampering, theft, and accidental damage. This kind of hardware is especially important for colocation facilities or businesses with shared server rooms.

    References
    Alavi, M., & Heidari, S. (2019). Surveillance cameras: Effectiveness in crime prevention and implications for policy. Journal of Security Studies, 12(3), 45-58. https://doi.org/10.1080/17467598.2019.1618921

    Kurtz, A. (2020). Best practices for securing business networks: The role of physical security. Information Security Journal, 15(2), 130-142. https://doi.org/10.1080/19393555.2020.1638542

    Rouse, M. (2021). Biometric authentication: How it enhances security for physical and digital assets. Cybersecurity Today, 28(4), 56-62. https://doi.org/10.1007/s12394-021-00089-9
