DIGITAL SECURITY AND INFORMATION ASSURANCE


This blog was created to stimulate academic discussion in partial fulfillment of the degree of Doctor of Computer Science in DIGITAL SECURITY AND INFORMATION ASSURANCE at Colorado Technical University, Colorado Springs, Colorado.

Courses include: EM835 Information Accountability and Web Privacy Strategies; SC862 Digital Security; Quantitative Analysis; CS854 Software Architecture and Design.

Sunday, March 30, 2014

Detection of LDAP injection attacks using a Big Data analytics tool in a distributed environment.



Privacy and computer security problems have been the two major hindrances to the use of information technology since the early nineties. Organizations' use of the Internet to deliver services has not only fueled the problem but also made it difficult to control. Hackers are always two steps ahead of conventional security solutions, and web applications are increasingly prone to attack because of known vulnerabilities and zero-day exploits.
As a doctoral candidate, I am researching the use of Big Data security analytics (BDSA) tools to detect LDAP misuse in a distributed environment. LDAP misuse is ultimately responsible for most LDAP injection attacks against LDAP-enabled web applications. By detecting that misuse, a Big Data analytics tool should be able to stop injection attacks in real or near-real time. Other security solutions are too slow to detect and stop injection attacks, owing to latency and a lack of event correlation, among other reasons. Check back, as information on the research will be made available in a timely manner.
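A minimal sketch, for illustration only, of the detection idea: user-supplied request parameters that contain unescaped LDAP filter metacharacters (per RFC 4515) are a strong signal of an attempted injection. The log layout, field names, and sample entries below are hypothetical and are not the BDSA pipeline under research, which would consume events from a distributed log store instead.

import re

# Characters with special meaning in LDAP search filters (RFC 4515).
# Unescaped occurrences in user-supplied input commonly indicate injection attempts.
LDAP_METACHARS = re.compile(r"[()*\\|&!\x00]")

def looks_like_ldap_injection(value: str) -> bool:
    """Flag a request parameter whose value embeds raw LDAP filter metacharacters."""
    return bool(LDAP_METACHARS.search(value))

def scan_requests(entries):
    """Yield (user, parameter) pairs whose values look like LDAP injection.

    `entries` is assumed to be an iterable of dicts such as
    {"user": ..., "params": {...}}; a real deployment would stream these
    from the analytics platform rather than a Python list.
    """
    for entry in entries:
        for name, value in entry.get("params", {}).items():
            if looks_like_ldap_injection(value):
                yield entry.get("user", "unknown"), name

if __name__ == "__main__":
    sample = [
        {"user": "alice", "params": {"uid": "jdoe"}},
        {"user": "mallory", "params": {"uid": "*)(uid=*))(|(uid=*"}},
    ]
    for user, param in scan_requests(sample):
        print(f"possible LDAP injection by {user} via parameter '{param}'")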

Saturday, October 19, 2013

The intricacies of Research processes




Arguably, the four most important research elements the article reminded me of are design, measurement, analysis, and high ethical standards, whatever research methodology or approach is used in my doctoral dissertation. The authors emphasized how solid research planning, or strategy, often produces an excellent research result that can contribute to the body of knowledge. According to the authors, this observation and experience cut across both qualitative and quantitative research methodologies.

I initially showed an interest in using a qualitative research strategy during and after the qualitative class last semester, but the dynamic has changed.
The reason is that a quantitative method would let me use various statistical analysis procedures that may help explain why the incidence of malware on the web shows no sign of abating. I should be able to use either an experimental or a survey research design to focus my research. My research committee, under the direction of Dr. Stout, has the final say on this, although the IRB process will be an integral part of the research to ensure that high ethical standards are followed.

The authors' advice to do a thorough literature search as part of the research plan is no doubt important. It can help narrow the research focus and provide a means to craft a well-defined research proposal.
In addition, research reports, according to the article, must be presented in a format acceptable both to the school and to the academic community. Peer review of papers to be published was also recommended for credibility.

Reference:


Burian, P., Rogerson, L., & Maffei III, F. (2010). The research roadmap: A primer to the approach and process. Contemporary Issues in Education Research (CIER), 3(8), 43-58. Retrieved from http://journals.cluteonline.com/index.php/CIER/article/view/226/217

Monday, September 3, 2012

RBAC Approach and implementation

Access control is the heart of security and the first line of defense in asset protection, arguably so because it allows only authorized users, programs, or processes to access a particular object on a networked or standalone system.
In discretionary access control (DAC), the custodian of the data or information determines or specifies which persons or subjects can access the data or information resource. Access to the information asset is at the discretion of the owner (Harris, 2010).

Most DAC systems grant or deny access based on the identity of the requester, often using access control lists (ACLs) to grant or deny access to network resources. In mandatory access control (MAC), by contrast, access to an information resource is based solely on a security labeling system: users hold security clearances, and resources carry security labels with data classifications. MAC implementations are found in environments where information classification and high confidentiality are of paramount importance, the military being a good example. In a MAC implementation, the system makes access decisions by comparing the subject's clearance and need-to-know level to the resource's security label (Harris, 2010). An essential feature of MAC is that the underlying operating system enforces the system's security policy through security labels on information assets and the level of security clearance a user possesses.

In contrast to the above two models is role-based access control (RBAC), sometimes called the non-discretionary model. With RBAC, access to information resources is based on the role users are assigned in the organization and nothing more. Kayem, Akl, and Martin (2010) observed that role-based access control is a combination of mandatory and discretionary access control, and that RBAC models are more flexible than their discretionary and mandatory counterparts because users can be assigned several roles and a role can be associated with several users.

The right access control implementation depends on the environment. But in a distributed environment where I have been privileged to implement RBAC, the RBAC model is the best of the three because users' roles can be mapped to job function and authorization level. By using the authorization level, user privileges can easily be designed without resorting to the ACLs commonly used in DAC. In addition, RBAC, according to Kayem et al. (2010), assigns permissions to specific operations with a specific meaning within an organization, rather than to low-level files as in other models. The incidence of Trojan horse infection on the network can be reduced by implementing RBAC, whereas DAC is silent on the ways files may be modified in network operations, which opens more ground for security vulnerabilities.
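The sketch below illustrates the role-to-permission mapping just described. The user, role, and permission names are hypothetical; in practice the mappings would be maintained in the directory or identity management system rather than hard-coded.

# Roles group permissions around job functions (hypothetical names).
ROLE_PERMISSIONS = {
    "hr_clerk":   {"read_employee_record"},
    "hr_manager": {"read_employee_record", "update_employee_record"},
    "auditor":    {"read_employee_record", "read_audit_log"},
}

# Users are assigned one or more roles; they never receive permissions directly.
USER_ROLES = {
    "jdoe":   {"hr_clerk"},
    "asmith": {"hr_manager", "auditor"},
}

def is_authorized(user: str, permission: str) -> bool:
    """Grant access only if some role assigned to the user carries the permission."""
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in USER_ROLES.get(user, set()))

print(is_authorized("jdoe", "update_employee_record"))    # False
print(is_authorized("asmith", "update_employee_record"))  # True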

I will use centralized access control administration as a way to increase security, because all access requests will go through a central authority, visibility into access operations will be enhanced, and administration will be simpler. As an administrator, I will only have to cope with a single point of failure, and access performance bottlenecks on the network will be easier to control (Smith, n.d.).


Reference:

Harris, S. (2010). CISSP all-in-one exam guide (5th ed.). Columbus, OH: McGraw-Hill.

Kayem, A., Akl, S., & Martin, P. (2010). Adaptive cryptographic access control. Advances in Information security, DOI 10.1007/978-1-4419-6655-1_2.

Smith, J. (n.d.). Access control systems & methodology. Retrieved April 29, 2012, from
www.purdue.edu/securepurdue/docs/training/AccessControls.ppt

Risk Management a must for Security

Risk management in an information security program is one of the yardsticks of the due diligence and due care that form the cornerstone of information security governance. One way to incorporate risk management and assessment into the security program is to establish a security policy and procedures in the organization. The security policy will form the basis of a risk management policy tailored to address the following:

* Uncovering potential dangers in the environment

* Researching and understanding the vulnerabilities, threats, and risks that are peculiar to the environment

* Performing periodic security assessments

* Analyzing the assessment data, which can be used to establish a security baseline with the security controls necessary to adequately safeguard information assets

We need to know where we are going and where we are coming from, in terms of security, for the security program to succeed in any organization. The benefits of risk analysis are immeasurable because it helps us understand what exactly is at risk in the environment, conform to due care, and comply with legal and regulatory requirements (Harris, 2010).

By performing risk analysis, we are in a better position to know what security controls, countermeasures, and safeguards to implement in order to reinforce the environment's security posture in view of known vulnerabilities and risks. For instance, the risk assessment could show that patch management and anti-malware deployments should be more visible. According to Harris (2010), a risk analysis helps integrate the security program objectives with the company's business objectives and requirements.
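On the quantitative side, Harris (2010) describes risk in terms of single loss expectancy (SLE) and annualized loss expectancy (ALE). The sketch below only illustrates that arithmetic; the asset value, exposure factor, and incident rate are made-up figures.

def annualized_loss_expectancy(asset_value: float,
                               exposure_factor: float,
                               annual_rate_of_occurrence: float) -> float:
    """ALE = SLE x ARO, where SLE = asset value x exposure factor."""
    single_loss_expectancy = asset_value * exposure_factor
    return single_loss_expectancy * annual_rate_of_occurrence

# Hypothetical example: a $200,000 database, 25% of its value lost per malware
# incident, and two incidents expected per year without better patch management.
print(annualized_loss_expectancy(200_000, 0.25, 2))  # 100000.0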

I will confront emerging threats by making sure that adequate security controls, both technical and administrative, are in place and by fine-tuning the continuous education of users on the importance of information security as they perform their daily tasks. Security monitoring and awareness training will also be heightened to address all forms of social engineering and non-compliance with security policy and procedures. Without implementing adequate protection measures, enterprises are at risk of having their operations critically disrupted (Murphy & Zwieback, 2005). No amount of IDSs, IPSs, and firewalls can offer the necessary protection if the users, who are the first line of defense, fail to follow simple security rules.
The risk assessment will include a detailed threat and vulnerability analysis, a thorough examination of countermeasure mechanisms, and asset identification. Without these components, the purpose of the risk assessment is defeated and the whole risk management program might be in jeopardy.

References:

Harris, S. (2010). CISSP all-in-one exam guide (5th ed.). Columbus, OH: McGraw-Hill.

Murphy, J. & Zwieback, D. (2005). Managing emerging security threats. Retrieved April 24, 2012, from http://www.greetsomeone.com/pdf/inkcom_managing_security_threats.pdf

Cyber attack or terrorism is real

President Bill Clinton put it rather directly in 1998: "Our foes have extended the fields of battle – from physical space to cyberspace" (O'Hara, 2004). If our former president acknowledged this fact, I strongly believe that cyber terrorism, or cyber attack, is a real and ever-expanding threat to our well-being and a serious security challenge. O'Hara (2004) pointed out that cyber warfare is now a primary tool in the information warfare arsenal for achieving non-kinetic attacks, the type of attack not aimed at physical destruction but designed to affect the adversary's will to fight and decision-making process.

The US federal government has acknowledged that we are susceptible to cyber terrorism because digital security controls have not been built into most of our critical systems from the design phase or throughout the system life cycle. We are only now catching up, and the resilience of our cyber protection mechanisms is still questionable. Cyber attacks can easily be launched by anyone with a computer, an Internet connection, and a variety of hacking and cyber warfare tools, which are available on a multitude of Internet sites worldwide. The price of perpetrating a cyber attack is just a fraction of the cost of the economic or physical damage such an attack can produce; cyber attack is also characterized by aggressive enemy efforts to collect intelligence on the country's weapons, electrical grid, traffic-control systems, and even its financial markets (Lipman Report, 2010).

The damage to our critical infrastructure would be unprecedented if we were attacked by cyber criminals sponsored by rogue states or by organized crime. Our transportation hubs, air traffic control systems, water treatment plants, and telecommunication facilities are targets of such attacks; the impact on our lives would be catastrophic, and the economic loss would be immeasurable, perhaps worse than 9/11. Cyber warfare can negatively affect our economic prosperity in this century and beyond. Just recently, a cyber attack involving the Stuxnet worm caused international havoc and systematically disrupted the Iranian nuclear program.

The threat of cyber attacks is real if the likes of Google's and Cisco's networks can be hacked and attacked by the bad guys. Exploitable vulnerabilities are leaving our critical infrastructure so insecure that hackers are just a step away from using malicious code to take full control of even highly classified systems. This is frightening, but it is the truth. Only recently did the US federal government, through the various agencies under the auspices of NIST, direct them to perform periodic risk assessments of their systems and network infrastructure. The agencies are to develop remediation and mitigation plans to curtail security risks and other associated problems within a set timeframe. However, if we anticipate or know of any imminent threat from cyber criminals or rogue states, we have an absolute right to defend ourselves using the full information security arsenal and even conventional weapons. Pre-emptive strikes should be part of our cyber security defense vocabulary, and we must be capable of developing cyber offensive capabilities. Good defensive operations will point in the direction of the attacker, which then allows offensive operations to target them for retaliation (O'Hara, 2004).

Urgent proactive actions such as continuous monitoring, patch management, and the development of multiple layers of defense, as well as perimeter security, are needed to guard against cyber warfare and other malicious intent. In addition, we need to train more security professionals who can design secure systems, write safe computer code, and create the ever more sophisticated tools needed to prevent, detect, and mitigate attacks and reconstitute systems afterward (Lipman Report, 2010). We must not be complacent about our security or develop a false sense of security while we are still vulnerable to incessant cyber attacks.



References


O'Hara, T. (2004). Cyber warfare and cyber terrorism. Retrieved April 12, 2012, from http://www.dtic.mil/cgi-bin/GetTRDoc?Location=U2&doc=GetTRDoc.pdf&AD=ADA424310

The Lipman Report (2010). Threats to the information highway: Cyber warfare, cyber terrorism and cyber crime. Retrieved April 12, 2012, from http://www.guardsmark.com/files/computer_security/TLR_Oct_10.pdf


Godwin Omolola

Saturday, October 1, 2011

Common Quantitative Analysis Questions and Answers.

Below are common quantitative analysis questions and solutions.

What is the nature of causation and what are the criteria for assessing causation?
The nature of causation is often to show that causes differ in significance and thus warrant further investigation and detailed examination. This investigation usually falls into three distinct categories: determining the necessary cause, the sufficient cause, and the contributory cause. To show causation, it must be obvious that the variables are statistically related, yet causation can never be proven with complete certainty.
Causes also differ in the timing of their influence.

The major criterion for assessing causation is whether the conditions for the cause exist in the first place. Establishing causation also makes it possible to conduct further investigation of the correlation between variables.
It is important to know whether the causation is linear or reciprocal in order to assess it: linear causality assumes the causal influence runs in one direction, while reciprocal causality means the influence is bidirectional or sometimes circular.
In quantitative research analysis, correlation can never by itself show causation, because correlation does not imply causation.


Discuss the difference between experimental and correlational research.
Experimental research involves processes that clearly define the design approach used in conducting the research in question; in a way, it emphasizes how data are collected for the final analysis. In experimental research, the researcher can manipulate the experiment and observe the effect, or result, of that manipulation.
Correlational research, by contrast, involves the analysis tools used in conducting the research. Variables are not influenced; only the relationship between them is measured and shown by the researcher. In addition, correlations sometimes appear in experimental research, but typically one of the variables is manipulated.

Finally, experimental research is only about the experiment, while correlational research is common in archival research, naturalistic observation, survey research, and even case studies.
 
How can surveys be designed to elicit the most valuable responses?

Surveys generally serve two main purposes: to learn something new about the research topic and to be able to generalize from the survey data. Therefore, in order to elicit adequate and valuable responses from respondents, it is essential to use a design approach consistent with the following:
- The survey questions must be well tailored to the focus group.
- The questions should be well designed and meaningful, and the sample size manageable.
- Appropriate survey techniques, such as mail, email, or telephone, must be employed for the circumstances under consideration.
- A language of communication that the audience is comfortable with, especially during face-to-face encounters, must be used.
- Survey questions must be free of ambiguity; it is important to pilot-test the questions.
- A face-to-face encounter with survey respondents often yields more tenable answers.
- The researcher must provide enough clarification when asked by survey respondents.
- There is no room for bias, because a biased sample will produce biased results.


When do ethical issues become important in experimental research?
The ultimate aim of any experimental research is to advance and contribute to the body of knowledge. Serious ethical issues become concerns when, for instance, there is glaring abuse in the selection of research participants. A good selection process must be undertaken and encouraged; it must be fair and balanced, not tainted to favor one group or set of people.
Informed consent must be obtained from all research participants, partly to address any concerns they have and partly to give them the opportunity to know what the research is about. Otherwise, the experimental research could be said to fail one of the ethical standards recognized worldwide in research projects.
The risks and benefits of the research must be well spelled out and clarified to the subjects or research participants in order to alleviate issues that border on ethical concerns, which I believe could later jeopardize the whole research effort.
Discuss threats to validity.
Bias is increasingly recognized as one of the most important threats to validity and must be addressed in the design, conduct, and interpretation of research. Failing to remove bias can, in the long run, compromise the final interpretation of the research result and hence its validity.
Also, low reliability of measures, which can be attributed to factors such as poor question wording, bad instrument design or layout, or illegible field notes, is a great threat to research validity.
Communication passing among research subjects who are not supposed to share information until the research exercise is over can also negatively affect the validity of the research. This can greatly reduce the measurable effects of the research program, because the information obtained indirectly casts doubt on the true validity of the research results.

How can computer output for t-tests and confidence intervals be interpreted?
The t-test output can be interpreted in the context of regression analysis as testing whether a regression coefficient (b) is significantly different from zero. In the context of experimental work, it tests whether the difference between two observed means is significantly different from zero. For instance, SPSS provides the exact probability that the observed value of t would occur if the value of b in the population were zero. If the observed significance is less than .05 (p < .05), we may conclude that the result reflects a genuine effect in the population.
Confidence intervals around means can be interpreted as the range of values that contains, with some probability (95%), the true value in the population. In computer output, the interval is reported as a lower and an upper limit around the estimate. This means the true value of the difference in the population lies somewhere between those limits, and we can be 95% confident of this.
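The short sketch below, using simulated data and SciPy rather than SPSS, shows where the t statistic, p value, and 95% confidence interval for a difference in means come from; all numbers are made up for illustration.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
control = rng.normal(loc=10.0, scale=2.0, size=30)    # hypothetical scores
treatment = rng.normal(loc=11.5, scale=2.0, size=30)

# Independent-samples t-test: is the difference between the two means
# significantly different from zero?
result = stats.ttest_ind(treatment, control)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")  # p < .05 suggests a genuine effect

# 95% confidence interval for the difference in means (pooled standard error).
diff = treatment.mean() - control.mean()
n1, n2 = len(treatment), len(control)
pooled_var = ((n1 - 1) * treatment.var(ddof=1) + (n2 - 1) * control.var(ddof=1)) / (n1 + n2 - 2)
se = np.sqrt(pooled_var * (1 / n1 + 1 / n2))
t_crit = stats.t.ppf(0.975, df=n1 + n2 - 2)
print(f"95% CI for the difference: [{diff - t_crit * se:.2f}, {diff + t_crit * se:.2f}]")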

Are regression and ANOVA antagonistic methods?
The answer is an emphatic no. Regression and ANOVA are both based on the general linear model (GLM). The techniques are recognized as two of the most useful statistical analysis tools, be it in business, psychology, or sociology circles. In fact, they are complementary, not antagonistic as some schools of thought argue. For instance, in much research, when it is desirable to test a multiple regression equation for statistically significant factors or measurable variables, it is natural to start with an ANOVA.
By the same token, when using ANCOVA to analyze the degree of covariance, and in order to improve on an ANOVA technique, it is usually incumbent on the researcher to use regression methods. Regression applied in this way enables on-the-spot adjustment for a control variable before embarking on the ANOVA. The reliability of the statistical analysis and results in such research is further reinforced. So I submit to this forum that it depends on what the statistical research wants to achieve and the information the researcher wants to convey to the public.
It is important to stress that for any statistical output to be considered valid when using ANOVA or regression techniques, certain assumptions must be fulfilled and tested. For example, in ANOVA the independent variables must be categorical.
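To make the complementarity concrete, the sketch below (simulated data; the pandas and statsmodels packages are assumed purely for illustration) fits a categorical predictor as a regression model and then summarizes the same fit as a one-way ANOVA table: the F statistic is identical, because both techniques are special cases of the general linear model.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
df = pd.DataFrame({
    "group": np.repeat(["A", "B", "C"], 20),
    "score": np.concatenate([rng.normal(m, 1.0, 20) for m in (5.0, 5.5, 6.5)]),
})

# Fit the categorical predictor as an ordinary least-squares regression...
model = smf.ols("score ~ C(group)", data=df).fit()

# ...then read the same fit as a one-way ANOVA table.
print(sm.stats.anova_lm(model, typ=1))
print(f"overall F from the regression: {model.fvalue:.3f}")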

Reference:
Field, A. P. (2009). Discovering statistics using SPSS (3rd ed.). Thousand Oaks, CA: Sage.
Vogt, W. P. (2007). Quantitative research methods for professionals. Boston, MA: Pearson Education.

Thursday, September 29, 2011

Type I error and Type II error in Quantitative Analysis

Type I error and Type II error
A Type I error occurs when it is inferred that a hypothesis is true when it is actually false; in other words, an experimental study is falsely considered to have supported the hypothesis. An example is a patient who is wrongly told that he or she has a highly infectious disease when in fact he or she does not, or a driver punished through no fault of his own on the basis of eyewitness reports. In statistical parlance, both cases are a false rejection of the null hypothesis.
A Type II error, by contrast, is committed when it is concluded that a hypothesis is false when it is actually true. For example, a group of white Caucasians is wrongly classified as Asian in an ethnic study, or a pregnancy test comes back negative when in reality the woman is two months pregnant. This is a false acceptance of the null hypothesis.
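A rough numerical illustration (simulated data, with an assumed true difference of 0.5 standard deviations): running many t-tests shows the Type I error rate staying near the chosen alpha when the null is true, while the Type II error rate reflects the limited power of the design when a real effect exists.

import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
alpha, n, trials = 0.05, 30, 5_000

# Type I error: both groups are truly identical, yet we sometimes reject the null.
false_positives = sum(
    stats.ttest_ind(rng.normal(0, 1, n), rng.normal(0, 1, n)).pvalue < alpha
    for _ in range(trials)
)

# Type II error: a real difference of 0.5 exists, yet we sometimes fail to reject the null.
false_negatives = sum(
    stats.ttest_ind(rng.normal(0.5, 1, n), rng.normal(0, 1, n)).pvalue >= alpha
    for _ in range(trials)
)

print(f"Type I error rate  ~ {false_positives / trials:.3f} (expected ~ {alpha})")
print(f"Type II error rate ~ {false_negatives / trials:.3f} (equal to 1 - power)")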