
COIT 20246 ICT Services Management

Published: 05-Sep-2021


The text discusses conventional binary computing architecture.  However, an emerging field of research focuses on the development of quantum computers.  Find and summarise a resource that explains why there is so much interest in this type of computing (for example, the potential applications).

The text briefly mentions the Linux operating system, an open-source Unix variant developed in the 1990s.  There are many popular Linux distributions (look up “Linux Distribution” if you don’t know what that means).  Find and summarise a resource that describes the features and advantages of ONE of these distributions.

The text also briefly mentions crowdsourcing, a form of social business that organisations employ to engage with the public to achieve a goal.


Conventional binary computing architecture is based on a “binary” pattern made up of only two digits, 0 and 1. To store any type of data in a computer system, we need to convert the data into binary format. Integers and characters are the most fundamental data types in the binary architecture of a computer system, as these data types can be used to build any other form of data. Since the beginning of computer systems, computer architects have been continuously researching different patterns for the representation of numeric values. A conventional computing system is not capable of exploring all the possibilities of a computational problem at once; it performs the same computation a person could do by hand, only faster. That is why there is a need for a computing architecture that can handle all the possibilities of a computational problem. (Why Do Computers Use the Binary Number System?, 2017)
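As a minimal illustration of the binary representation described above, the following Python sketch (the particular values are arbitrary examples) shows how an integer and a character reduce to patterns of 0s and 1s:

```python
# Illustrative sketch: how an integer and a character are represented
# in the binary format used by conventional computers.
n = 42
bits = format(n, "b")              # integer 42 as a binary pattern
print(bits)                        # 101010

c = "A"
char_bits = format(ord(c), "08b")  # character 'A' as its 8-bit code
print(char_bits)                   # 01000001
```

Every other data type a program uses is ultimately built from patterns like these.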

Quantum computers are computers that use quantum states to store information. Since computers were invented, we have relied heavily on them to store our useful data. Data scientists have projected that if the current trend of data storage continues, all the machines around the globe will not be able to keep up by 2040. To overcome this, the computer industry is now focusing on finding ways to use computing resources more efficiently. Quantum computing architecture has shown great promise for processing large amounts of data, in particular for simulating mechanical processes governed by quantum physics.

Information is stored in the form of quantum bits (qubits). A qubit differs from a conventional binary bit in that it can also be in a superposition of the two states, written A|0⟩ + B|1⟩, where A and B are complex numbers. In quantum computers, calculations are performed as unitary transformations. This lets quantum computers handle possibilities that cannot be handled by hand calculation. Using quantum computing, more efficient algorithms can be created to solve factoring, searching and quantum-mechanical simulation problems. (, 2017)
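The superposition A|0⟩ + B|1⟩ can be sketched in plain Python. The amplitudes below are an arbitrary example; the only physical constraint is |A|² + |B|² = 1:

```python
import math
import random

# One qubit in the superposition A|0> + B|1>, with complex amplitudes
# A and B. These particular values are an arbitrary example.
A = complex(1 / math.sqrt(2), 0)
B = complex(0, 1 / math.sqrt(2))

p0 = abs(A) ** 2   # probability of measuring 0
p1 = abs(B) ** 2   # probability of measuring 1
assert math.isclose(p0 + p1, 1.0)   # amplitudes must be normalised

def measure():
    # Measurement collapses the superposition to a classical bit.
    return 0 if random.random() < p0 else 1

print(round(p0, 3), round(p1, 3))   # 0.5 0.5
```

A real quantum computer manipulates such amplitudes with unitary transformations across many entangled qubits, which is what a classical simulation cannot do efficiently.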

Linux is an open-source operating system. It was developed by Linus Torvalds in the 1990s as a variant of the UNIX operating system. The main characteristics of the Linux operating system are: (, 2017)

It is an open source OS

Freely distributable

Cross platform operating system

Can be installed on almost all types of devices: PCs, laptops, netbooks, tablets, mobile phones, servers, gaming consoles and even supercomputers.

Linux is packaged in the form of Linux distributions for desktop PCs and servers. A distribution includes the Linux kernel as well as various tools and libraries that support basic operating system operations. Strictly speaking, Linux itself is not a complete operating system, it is just a kernel. Many Linux distributions are available, each of which takes the Linux kernel and packages it with other freely available software. When we want to install the Linux operating system, we must pick a suitable distribution from the available variants. (, 2017)

One popular Linux distribution: Ubuntu

Ubuntu is one of the most popular and well-known Linux distributions. It is based on Debian but maintains its own software repositories. It has almost everything we need to perform any type of task in a professional organization, with essential applications such as an office suite, a web browser, media applications, email management applications, games and so on. It is open source and can be freely shared. Ubuntu ships with built-in firewall and virus protection, which is why it is often regarded as one of the most secure operating systems worldwide. Ubuntu is backed by Canonical, a global computer software company, which provides services such as management tools, patching, training support, and new enhancements to the current version.

Virtualization is the concept of creating a virtual version of something rather than an actual one, such as an operating system, a server, a storage medium or a network resource. The concept has been used in computer systems since the early days of the hard drive, when disk drives were partitioned into logical sections. Virtualization can be implemented not only in hardware but also in software, and operating systems can be included in this category: in operating system virtualization, one piece of hardware (a hard drive) can be used to run multiple operating systems. (SearchServerVirtualization, 2017)

Operating system virtualization

In operating system virtualization, an application, more than one operating system, or a storage medium can be used in an abstract manner. It makes possible the use of different operating systems on the same hardware simultaneously, which is helpful when different applications require different operating systems. Operating system virtualization can also be applied at the operating system level, where it is known as server virtualization. In this technique a standard operating system is set up so that it can run various applications at the same time: the operating system is altered so that it behaves as several separate instances, and these instances can accept commands from multiple users running different applications on the same machine. (, 2017)

A virtual machine behaves in the same way as a physical machine. It has its own (virtual) hardware, such as a central processing unit, hard disk drive, network interface card and RAM. A virtual machine runs as a guest within a host operating system environment; the terms "host" and "guest" are used to distinguish the actual machine from the virtual one. Virtualization is implemented by inserting a separate layer of software called the hypervisor, which conceptually sits one level above the actual operating system layer and manages the guest machines' access to hardware.

Layer 4 (L4) is named the transport layer in the OSI model of networking and communication. The transport layer is responsible for transporting data packets and acts as a liaison between the higher and lower layers of the OSI protocol stack. It provides the functionality necessary to make data communication successful: confirmation of delivery of data packets is determined at this layer, and it also provides services for establishing connections between applications. (, 2017)

Transport layer services can be broadly classified into two types:

Connection-oriented services: In this type of service, a dedicated connection must be established between the devices before data transfer can begin. TCP (Transmission Control Protocol) is used for this type of service; it is responsible for creating the connection and responding on it, and every data transfer must also be acknowledged. TCP is considered one of the reliable protocols of the network suite.
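A minimal Python sketch of a connection-oriented exchange, using TCP sockets on the loopback interface (the message contents, and the use of port 0 to let the OS pick a free port, are illustrative choices):

```python
import socket
import threading

# Connection-oriented (TCP) exchange: the connection must be
# established before any data flows, and TCP acknowledges delivery.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))        # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

def serve():
    conn, _ = server.accept()        # dedicated connection established here
    data = conn.recv(1024)
    conn.sendall(b"ack: " + data)    # reply over the same connection
    conn.close()

t = threading.Thread(target=serve)
t.start()

client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))  # TCP three-way handshake happens here
client.sendall(b"hello")
reply = client.recv(1024)
print(reply)                         # b'ack: hello'
client.close()
t.join()
server.close()
```

Note that `connect()` must succeed before either side can send any data; that handshake is exactly what makes the service connection-oriented.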

Connectionless services: These services do not require a connection to be established between sender and receiver; the protocol simply takes the data packets and transfers them to the receiving device. UDP (User Datagram Protocol) is the L4 protocol used for this type of service. It is an unreliable protocol: it provides checksums to maintain data integrity, along with port numbers, but it lacks any handshaking mechanism. (, 2017)
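The connectionless case can be sketched the same way, again on the loopback interface with illustrative message contents:

```python
import socket

# Connectionless (UDP) exchange: no handshake and no connection state.
# Each datagram is addressed individually and delivery is not guaranteed
# (loopback delivery is reliable in practice, which keeps this demo simple).
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))      # port 0: let the OS pick a free port
port = receiver.getsockname()[1]

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"hello", ("127.0.0.1", port))  # no connect() needed

data, addr = receiver.recvfrom(1024)
print(data)                          # b'hello'
sender.close()
receiver.close()
```

The contrast with the TCP case is that `sendto()` fires immediately with no prior handshake, which is why UDP offers no delivery guarantee.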

One more protocol used at this layer is DCCP (Datagram Congestion Control Protocol). It is mainly a message-oriented protocol: it performs a reliable connection set-up between communicating devices and handles congestion (traffic) control, providing a flow-based mechanism on data packets just like TCP, but the delivery of individual data packets is not guaranteed. It is useful in applications where delivery is time-constrained and streaming must be used to maintain synchronization between devices.

A cyber-attack mainly targets computer systems, their infrastructure and networks of computer devices; the whole network can be the target of a malicious act. In a cyber-attack the identity of the attacker is usually hidden, and the attack can take the form of stealing, altering or destroying data on the target system. (Epstein, Smith and Wehner, 2017)

There are many examples of cyber-attacks worldwide, but here we take the most recent one: the WannaCry ransomware attack. This malware mainly targeted the Microsoft Windows operating system. Its effect was to encrypt the victim's data and demand payment in the Bitcoin cryptocurrency: the infected system demanded a payment of $300 in bitcoin to decrypt the affected files, and if the user did not pay within three days the amount doubled. After seven days the malware would delete all the encrypted files from the system, causing huge losses. The attack started on 12 May 2017 and affected more than 150 countries and around 230,000 computers, spreading via local networks and the Internet. Windows systems that had not been updated recently were affected: Microsoft had released a critical patch for supported Windows systems on 14 March 2017 that closed the underlying security gap, which was exploited by EternalBlue, an exploit developed by the National Security Agency and later leaked by the hacker group known as the Shadow Brokers. But many organizations had not applied this patch.

Crowdsourcing means engaging people, through many different channels, to achieve a specific target. The target can be relevant to an organization's growth, a social cause, or some future goal. Crowdsourcing is one of the most widely used techniques in organizations today. Sometimes a government wants to implement constitutional amendments and also wants the active participation of the public; in such a scenario, the number of people involved in a specific cause can predict the outcome of the project. Crowdsourcing has proved to be a multiplayer experience based on the collective participation and active involvement of users.

Social media is one of organizations' favorite tools for getting in touch with customers. Whether customers get involved depends entirely on their willingness and on the tools and time available to them.

One of the first examples of successful crowdsourcing was Doritos' "Crash the Super Bowl". Doritos was one of the first companies to use crowdsourcing in its business as a marketing initiative, and it has stayed on top by smartly using the concept of customer-created advertisements. The "Crash the Super Bowl" contest has grown more and more popular, with the number of participants increasing year by year. To date, Doritos has awarded around $7 million in prize money through this contest, and for its final contest it offered $1 million along with the opportunity to work with some of the most experienced and esteemed people in the entertainment and media field. One of the most popular Doritos ads, from 2011, proves that you cannot tease a hungry dog with Doritos chips... it is amazing and my favorite too.

Mass data collection describes the collection of data that may be legal or illegal; it is not inherently right or wrong. There are many situations in which a start-up company needs as much data as possible for its setup, and almost every company needs data for its business processes. This data may relate to customers, their choices and preferences, and the updates they want in existing products or systems. Without such data it would be very difficult for an organization to plan its future business strategy. Much software is available for data interpretation and analysis, and after analysis the best decisions can be made.
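As a minimal sketch of the kind of analysis described above (the survey responses below are made-up example data, not from any real company):

```python
from collections import Counter

# Aggregate made-up customer feature requests to support a decision.
responses = ["dark mode", "faster sync", "dark mode", "offline use",
             "dark mode", "faster sync"]

counts = Counter(responses)
top_request, votes = counts.most_common(1)[0]
print(top_request, votes)   # dark mode 3
```

Even this trivial tally turns raw collected data into something a planner can act on; real analysis tools apply the same idea at much larger scale.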

Collecting large amounts of data sometimes results in fraud: there is a risk of confidential customer information being stolen and misused without any intimation to the customers. A simple example: when we purchase a SIM card, we must provide identity and residential proof, such as a PAN card or voter ID card. Although this process is considered secure, these identity proofs are sometimes misused by terrorists, who steal the credentials and use numbers registered in other people's names.

Recently in the UK, mass data collection has been used to help prevent terrorist activity. The collection is done by British spy agencies and is helpful in preventing terrorism to some extent. In Britain, according to David Anderson QC, the law gives MI5, MI6 and GCHQ the right to gather data in bulk.

It is often said that large ICT projects fail more often than small ones. Take the example of an IBM project begun in 1956 to produce a supercomputer. After five years IBM completed the IBM 7030, the first transistor-based supercomputer, whose first unit was delivered to a national laboratory in 1961. It could handle roughly half a million instructions per second and was considered the fastest computer in the world until 1964.

IBM's bid promised a supercomputer 100 times faster than the system it was to replace at the national laboratory, but the machine turned out to be only 30 to 40 times faster. Since the goal was not achieved, IBM had to cut the price of the system to $7.8 million, below the expected price of $13.5 million. So there can be cases in ICT projects where the overall cost of implementation exceeds the profit gained from it.

The chances of ICT implementation failure are greater in large projects, since at a large scale the planning of the various key components must be done with more dedication and precaution.

When an organization plans to move its business onto information and communication technology, various points must be discussed within the organization, including the following:

  • Financial status of organization
  • Flexibility to change or adaptation
  • Manpower with appropriate knowledge
  • The current technical platform of peers
  • Strategic planning for future

  • Scrutiny of the sections within the organization where ICT implementation should be prioritized


Widman, J. (2017). IT's biggest project failures -- and what we can learn from them. [online] Computerworld. Available at: [Accessed 19 May 2017].
