Knowledge based systems

Knowledge-based systems have become a class of computing that has attracted a great deal of publicity in the recent past. With the increased use of expert systems there has been a need to improve how knowledge bases are used to support them, and computerized systems are increasingly relied upon to provide viable solutions for many of today's industrial processes. Knowledge-based systems, and by extension expert systems, have played a crucial role in this computerization. This paper presents a knowledge-based system for designing a cookbook used for composing recipes, so that the information needed to prepare them is stored for the long term; its main purpose is to integrate the various components of a knowledge-based system into a viable whole. Storing kitchen information in households has long been a concern, and people complain of losing the details of their best recipes. The paper also gives a brief tutorial on knowledge-based systems and how they operate, beginning with a history of knowledge-base use and the improvements that have been made to it. A heuristic approach is used to arrive at a final design for the cookbook, and the approach can further synthesize new solutions to meet the required specifications. Details of the general features of knowledge-based systems are given; further details of the system itself are beyond the scope of this paper. An example cookbook is presented to illustrate critical aspects of the knowledge-based system, to explain the operation of the proposed heuristic search method, and to show the effectiveness of knowledge-based systems in the design of cookbooks in general.

The class of software popularly known as expert systems is referred to in this paper as knowledge-based systems. Knowledge-based systems have been a very active area of study in the recent past, attracting significant software R&D effort and product-development capacity, not only in universities but in corporations and government institutions as well. They have attracted particular attention within competitive, information-intensive sectors such as financial institutions.
This paper provides an introduction to knowledge-based systems. Similar papers include Elam and Henderson (1983) and Ford (1985), which focused on decision support systems (DSS). The significance of this paper is to enable managers to appreciate the use of the technology for storing important information, such as the design of cookbooks. There has been an outcry over the loss of important information in industry because it may have been stored in places that are not safe. The paper also explains the importance of using this technology in specific areas.

1.1 Knowledge based technology
Having a clear understanding of this technology poses a challenge, given that the term has been used to mean different things in different scenarios. For example, some people define expert systems as computer programs that use knowledge and inference to solve problems that would be regarded as difficult if they were to be solved by human beings, perhaps difficult enough to require significant expertise. Others define them as software created by gathering and codifying the knowledge of one or more experts, designed to perform a task that would normally require special expertise. The last definition, at least for this paper, treats expert systems as programs that reason with symbolic information, use heuristic rather than algorithmic approaches, and are flexible at both run time and design level.

These definitions give a broad consensus but leave ample scope for discussion as to the meaning of the key terms.

A knowledge base is a special database used primarily for the management of knowledge. It provides a means for the collection, organization, and retrieval of knowledge in computerized form. It also represents a collection of data capturing related experiences, together with the problems and solutions to which they relate.

To avoid unnecessary confusion, the term knowledge-based system will be used in this paper to mean the collection of inferences used to solve a particular problem that would be particularly difficult for human beings to solve; this is a subset of expert systems. Beyond this, knowledge-based systems technology also includes the collection of tools and techniques used to develop such systems, the manner in which those tools are applied, and the education, training and expertise needed to explain and expound their use.

Conventional software development deals with procedures or algorithms concerned with the precise organization and maintenance of structured data. Conventional systems analysis focuses on specific problems and builds an understanding by accumulating detailed information about all the cases that must be addressed. Knowledge-based systems, on the other hand, are more concerned with the knowledge required for this to happen. There is therefore a need for knowledge engineers who deal with descriptions of whole problem domains, and individual cases are then used to test the knowledge the knowledge engineers develop.

Representing knowledge as rules and heuristics has advantages over previous software development technology. One advantage is that not only can hard, well-defined knowledge be captured in the computer and its associated systems, but looser knowledge of great significance can be captured as well.

There are several ways of expressing knowledge, and few authorities clearly define how it should be expressed. The most commonly used form is what are commonly called production rules, or antecedent-consequent rules. They take the form
IF antecedent-condition
THEN consequence

as in, for example,

IF antecedent-condition
THEN Loan-eligibility = N

Expressions of this form can be used to state firm rules defining relationships between variables; such rules are generally true and are based on causal models. They can also be used to express heuristics, which may or may not hold in every case. Heuristics represent surface knowledge about the domain, while causal rules represent deep knowledge.
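
To make the notation concrete, here is a minimal sketch of how such antecedent-consequent rules might be encoded as data and matched against a case. It is written in Python purely for illustration; the rule names, attributes and values are hypothetical and are not taken from any particular expert-system shell.

# A minimal, illustrative encoding of production (antecedent-consequent) rules.
# Each rule has a list of antecedent conditions and a single consequent fact.
# The attribute names and values below are hypothetical examples.

rules = [
    {
        "name": "loan-eligibility-rule",
        "if": [("income", "low"), ("credit_history", "poor")],
        "then": ("loan_eligibility", "N"),
    },
    {
        "name": "loan-approval-rule",
        "if": [("income", "high"), ("credit_history", "good")],
        "then": ("loan_eligibility", "Y"),
    },
]

def rule_fires(rule, facts):
    """A rule fires when every antecedent condition is present in the facts."""
    return all(facts.get(attr) == value for attr, value in rule["if"])

# Example working memory (facts) for a particular case.
facts = {"income": "low", "credit_history": "poor"}

for rule in rules:
    if rule_fires(rule, facts):
        attr, value = rule["then"]
        print(f"{rule['name']}: {attr} = {value}")

Representing rules as plain data in this way is what makes them easy to inspect, explain and modify independently of the program that applies them.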

1.2 The process
Beyond the mere definition of terms, management needs to understand the process behind knowledge-based systems. The development and use of knowledge-based systems, as conventionally practiced, is described below.

There are phases in the development of a knowledge-based system. The first is the development phase, in which knowledge is extracted from people with the relevant expertise in the area concerned (Ramiler & Swanson, 2004). In the literature these people are generally called experts, or more generally domain specialists. The knowledge is commonly expressed in terms of antecedent-consequent rules. In some cases the domain specialist feeds the knowledge directly into the knowledge repository, but most of the time it is recommended that the knowledge pass through a knowledge engineer or programmer who encodes it in some language. Later, in the use phase, users consult the knowledge base directly without going through the knowledge engineer. The software applies inference to the rules stored in the knowledge base, using data specific to particular cases or to the domain more generally. A result is then given to the user in the form of a diagnosis, prognosis, decision, or recommendation on what to do next; the form of the result depends on the nature of the application. In some instances the user may request an explanation of how the software reached a certain conclusion, and the software has the capability to generate this.
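
As a rough illustration of this consultation step, the following Python sketch applies a couple of hypothetical rules to case data by simple forward chaining and records which rules fired, so that an explanation of the conclusion can be produced on request.

# Illustrative forward-chaining consultation with an explanation trace.
# Rules are (name, antecedents, consequent) triples; facts describe one case.

rules = [
    ("r1", [("temperature", "high"), ("pressure", "rising")], ("valve", "close")),
    ("r2", [("valve", "close")], ("operator_alert", "yes")),
]

def consult(rules, facts):
    """Apply rules repeatedly until no new facts are derived; keep a trace."""
    facts = dict(facts)
    trace = []
    changed = True
    while changed:
        changed = False
        for name, antecedents, (attr, value) in rules:
            satisfied = all(facts.get(a) == v for a, v in antecedents)
            if satisfied and facts.get(attr) != value:
                facts[attr] = value
                trace.append(f"{name}: because {antecedents}, conclude {attr} = {value}")
                changed = True
    return facts, trace

conclusions, explanation = consult(rules, {"temperature": "high", "pressure": "rising"})
print(conclusions)              # the recommendation or diagnosis for this case
print("\n".join(explanation))   # how the software reached its conclusion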

1.3 Areas of knowledge-based systems
There are emergent areas of knowledge-based systems included in this schema; they are described below.

The first is the automatic acquisition of knowledge, done through the analysis of historic cases, with the sole purpose of assisting the knowledge engineer so that the knowledge base can be created more directly.
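
As a hedged sketch of this idea, the fragment below derives naive candidate rules from a handful of hypothetical historic cases by counting which attribute values co-occur with which outcomes; a real rule-induction tool would be considerably more sophisticated.

# Naive rule induction from historic cases: for every (attribute, value) pair,
# propose a rule predicting the outcome it most often co-occurred with.
from collections import Counter, defaultdict

# Hypothetical historic cases, each with attributes and a known outcome.
cases = [
    ({"income": "low", "credit": "poor"}, "reject"),
    ({"income": "low", "credit": "good"}, "accept"),
    ({"income": "high", "credit": "good"}, "accept"),
    ({"income": "high", "credit": "poor"}, "accept"),
]

counts = defaultdict(Counter)
for attributes, outcome in cases:
    for attr, value in attributes.items():
        counts[(attr, value)][outcome] += 1

for (attr, value), outcomes in counts.items():
    outcome, support = outcomes.most_common(1)[0]
    print(f"IF {attr} = {value} THEN outcome = {outcome}  (supported by {support} case(s))")

The knowledge engineer would then review such candidate rules with the domain specialist before adding any of them to the knowledge base.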

There is also the general-purpose knowledge base, the analogue of an encyclopedia expressed in an appropriate form, or expressed as associational knowledge; it contains common knowledge or common sense and may be used as a basis for building domain-specific knowledge bases.

The last area is an inherent machine-learning ability, in which the results of new cases are used to modify the knowledge base already in existence.

1.4 Knowledge Base System Tools
They are classified into three groups: knowledge acquisition, including rule induction and other machine-learning models; knowledge representation, including semantic-network models such as object-attribute-value triplets and frames, together with production rules, inheritance, plausible reasoning and logic programming; and inference procedures, including goal-directed backward chaining, data-driven forward chaining, non-monotonic reasoning, and depth-first and breadth-first search strategies (Geelan, 2009).
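
To illustrate the difference between the data-driven and goal-directed procedures listed above, the following minimal Python sketch shows backward chaining: the system starts from a goal and works back through hypothetical rules to the facts needed to establish it.

# Minimal goal-directed (backward-chaining) search over antecedent-consequent rules.
# Hypothetical rules: each maps a tuple of sub-goals to a conclusion.

rules = {
    ("has_flour", "has_eggs"): "can_make_batter",
    ("can_make_batter", "has_oven"): "can_make_cake",
}

known_facts = {"has_flour", "has_eggs", "has_oven"}

def prove(goal, facts, depth=0):
    """Try to establish a goal either directly from facts or via rules."""
    indent = "  " * depth
    if goal in facts:
        print(f"{indent}{goal}: known fact")
        return True
    for antecedents, consequent in rules.items():
        if consequent == goal:
            print(f"{indent}{goal}: trying rule {antecedents} -> {consequent}")
            if all(prove(sub, facts, depth + 1) for sub in antecedents):
                return True
    print(f"{indent}{goal}: cannot be established")
    return False

prove("can_make_cake", known_facts)

Forward chaining would instead start from the known facts and fire rules until the goal appears or nothing new can be derived.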

These tools and techniques differ somewhat from those of conventional programming. The skills required to develop knowledge-based systems cannot be assumed to exist within an existing MIS department; they must therefore be nurtured or purchased, and then retained.

1.5 Database use
The rise of online transactions has led to greater use of databases than ever before. Dynamic knowledge-based systems have grown in number because, unlike years gone by when database tools were proprietary, there are now good free tools that can be used to develop robust databases that handle data very efficiently. What is more, in the spirit of open source, more people are using open-source tools and technologies to develop databases with the least effort. Many of these databases are the source of the information circulating on the Internet. Online transactions now include online banking, education, and online shopping, and all these systems make extensive use of databases. Databases are no longer used merely to store data; they also underpin the working of these online systems.

Although databases have proved to be of great benefit to current information systems, there is a lurking problem that is a time bomb for future systems: the systems that use these databases have put insufficient security measures in place to counter attacks. With knowledge-based systems being adopted by many data-intensive firms, the safety of databases is of paramount importance to many organizations. The recent rise in computer fraud is reason enough to keep companies and IT professionals on their toes over database security. Computer hackers are no longer youngsters idling on the Internet; they are full-time computer professionals aiming to obtain personal information that they can use in online shops and online banks to wreak havoc on its owners. The security of the information system that manages the recipes is of paramount importance, because any leakage in the system will disorganize the recipes. Information technology is growing very fast, a great deal of important data is posted to the Internet every day, and this data needs to be protected from attack. Unfortunately, many companies are not aware of this threat and leave their information in the hands of unqualified staff who have little concern for it.

1.5.1 Knowledge base data governance
Data governance for knowledge bases is an emerging discipline whose scope is still debated by many professionals. The discipline encompasses a convergence of data quality, data management, business process management, and the management of the risks surrounding the handling of data in any organization. It is through data governance that an organization gains the chance to exercise control over its business processes and the inferences associated with them.

Data governance is a collection of processes that ensure important and confidential data are managed formally throughout the enterprise. These processes ensure that the integrity of data is assured and that people can trust the data at every stage of handling. For this to be achieved, people are held accountable for the management of the organization's data and for keeping its quality high. People are also given responsibility for fixing and preventing data issues whenever they arise, so that data handling remains efficient. It is about using technology and educating people on the importance of data and the techniques required to manage it efficiently. When companies want total control of their data, they have to empower their people and employ the right technologies to make this tenable.

Data governance has arisen partly as a result of regulations pushing for good data governance structures; examples include Sarbanes-Oxley, Basel I, Basel II and HIPAA. There are also private data-privacy organizations that have been set up to look into the implementation of data governance practices. The implementation of data governance varies with its scope and origin: sometimes an executive initiates the mandate to start managing the data, sometimes it is the initiative of a management council. It depends on the degree of receptiveness to new procedures within the organization.

1.6 Future of knowledge based systems
With cloud computing taking shape in the world today, there will be a need for knowledge-based systems to be stored in the cloud. Systems stored in the cloud will need some form of security, and for this to be achieved there are logistics that the cloud vendor will have to discuss with the users of the cloud service.

In my case, I intend to have the information system stored in the cloud so that many people can access it wherever they are; all that is needed is Internet connectivity. This is more manageable than installing the system on every desktop computer. Cloud computing is a technology in which data and applications are stored on storage networks and servers located remotely and accessed by users via the Internet. It allows businesses and consumers to access applications without installing them on their own on-site servers; the applications are installed on remote servers. In the traditional model of application use, consumers purchase licenses for application software from their software provider and install it on their on-site servers. With cloud computing, the service is provided on demand and consumers pay a subscription fee. The technology increases efficiency because storage, memory and processing are centralized.

The private cloud is a different architecture from the mainstream version, in the sense that smaller IT systems within the company firewall offer the same services as cloud computing but on a closed internal network. The network may include divisional offices or corporate divisions within the company, other companies that are business partners, suppliers of raw materials, resellers, and other organizations connected with the parent company. The private cloud, also known as the internal cloud, is one of the architectures of cloud computing; these offerings represent cloud computing on private networks. This type of cloud computing is widely claimed to provide benefits such as addressing data security, corporate governance and reliability concerns. The disadvantage is that consumers still have to buy, build and manage the infrastructure, which defeats the reason they shifted to cloud computing in the first place. It also does not benefit from lower up-front capital cost and reduced hands-on management, which means it lacks the economic model that makes cloud computing such an intriguing concept. Research nevertheless suggests that cloud computing will be headed this way in the few years to come.

1.6.1 Database encryption
Because the database will be stored in the cloud, it will need to be secured there, and this will be achieved by encrypting the database. Encryption can be considered at three different layers of system security, each offering a different encryption solution: the database layer, the application layer and the storage layer. Depending on which layer would work best for the data, the encryption may be performed on the storage device, in the DBMS (Database Management System), or in the application where the data originates. At the database layer, organizational data can be secured by administrators and by authorized passwords used to access databases. The database layer is particularly at risk in e-business: hackers and attackers can exploit security vulnerabilities to reach data if the security policies have not described all the mechanisms protecting confidential data. Database-level encryption is applied at the column level of a table, and can therefore protect data if security and access controls are defined at this level, for example by giving unique codes to records and defining access authentications. Encryption by the DBMS has a limitation, however: it protects data at rest but not data moving between databases or between applications. A safer database layer should have secure audit trails, access control and known user identification, together with good encryption mechanisms. The application layer suits data elements that are processed, authorized and manipulated at the application tier. Some application interfaces, such as JCE and MS-CAPI, leverage standardized cryptography and offer enough flexibility in their frameworks for organizations to determine where encryption and decryption should occur. Application cryptography has its risks: standardized applications are not shipped to the customer with source code, so skilled resources are required to change the applications whenever the encryption model must be adapted to support enterprise data or to determine which encrypted data can be accessed. Application-layer encryption may also be deployed together with database-layer encryption so that the DBMS can decrypt data for particular functionality; otherwise database stored procedures can override application stored procedures and trigger operations differently.
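
Purely as an illustration of database-layer, column-level encryption, the sketch below encrypts one sensitive column before it is stored and decrypts it on retrieval. It assumes the third-party Python cryptography package and an invented customers table; it is not the native encryption mechanism of any particular DBMS.

# Illustrative column-level encryption: only the sensitive column is encrypted,
# so access controls and keys can be applied at the level of that column.
import sqlite3
from cryptography.fernet import Fernet  # third-party package: pip install cryptography

key = Fernet.generate_key()       # in practice the key lives in a key-management system
cipher = Fernet(key)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, card_number BLOB)")

def insert_customer(name, card_number):
    encrypted = cipher.encrypt(card_number.encode())   # encrypt before it reaches the table
    conn.execute("INSERT INTO customers (name, card_number) VALUES (?, ?)", (name, encrypted))

def get_card_number(customer_id):
    row = conn.execute("SELECT card_number FROM customers WHERE id = ?", (customer_id,)).fetchone()
    return cipher.decrypt(row[0]).decode()             # decrypt only for authorised callers

insert_customer("Alice", "4111111111111111")
print(get_card_number(1))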

APIs (Application Program Interfaces), which are sets of protocols, rules and routines, may require change, and this would also affect the characteristics of the data. It is often almost impractical to change the code, either through unfamiliarity with it or through limited IT resources. Storage devices should meet practical requirements for data security through support for multiple data stores and operating systems, and should allow database information to be shared across applications and throughout the enterprise network; storage capacity should be sufficient for the amount of data in the DBMS. The hardware level provides encryption at the level of the database servers, while storage-level encryption is applied in the storage subsystems, especially at the file level, on NAS or DAS (Direct Attached Storage). NAS (Network Attached Storage) is a computer on the network that only provides file storage services to other devices on the network; NAS systems contain one or more hard disks, often arranged logically into redundant storage containers or RAID arrays. Data storage devices can have administrative passwords so that only authorized persons can access them, and they should be properly fixed in lockable server cabinets. Hardware proliferation is a disadvantage of DAS systems: more equipment means less space for other business purposes, more licensing expense, more setup time, and more hardware to troubleshoot and fix when problems occur.

Database protection is only effective if the cryptographic keys used for encryption/decryption are securely generated and managed. Government and standards organisations that have looked into improving and standardising mechanisms for key distribution, rotation, replication, storage and disposal include ISO (International Organization for Standardization), ANSI (American National Standards Institute) and the ABO (American Banking Organisation); even so, the widespread use of cryptographic products remains limited (Agrawal, Kiernan, Srikant, & Xu, 2002). Some key management solutions provide a secure system with a central point of control for all layers of security, so that key management gives professionals the flexibility to apply encryption at the appropriate levels in the architecture. This is done by restricting keys to one database table that can only be accessed by administrators with the privileges to access and decrypt. This is good but can be risky, since security will then be based on trust: sensitive data can be compromised by a malicious administrator, and events of this kind do occur, because trust is not, and never will be, an IT best practice.
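
The following sketch illustrates one such arrangement under stated assumptions: data keys are wrapped (encrypted) under a master key and kept in a separate key table, so that rotating the master key re-wraps the data keys without touching the encrypted data. The table and function names are hypothetical.

# Illustrative key separation: data is encrypted with a data key, and the data key
# is stored only in wrapped form, encrypted under a master key held by the custodian.
from cryptography.fernet import Fernet  # pip install cryptography

master_key = Fernet.generate_key()      # held by the key custodian / HSM in practice
data_key = Fernet.generate_key()        # used to encrypt the actual records

key_table = {
    "recipes_data_key": Fernet(master_key).encrypt(data_key)   # only the wrapped key is stored
}

def rotate_master_key(old_master, key_table):
    """Re-wrap every data key under a new master key; stored data is untouched."""
    new_master = Fernet.generate_key()
    for name, wrapped in key_table.items():
        plain = Fernet(old_master).decrypt(wrapped)
        key_table[name] = Fernet(new_master).encrypt(plain)
    return new_master

ciphertext = Fernet(data_key).encrypt(b"secret recipe")
master_key = rotate_master_key(master_key, key_table)

# Data is still readable via the (re-wrapped) data key after rotation.
unwrapped = Fernet(master_key).decrypt(key_table["recipes_data_key"])
print(Fernet(unwrapped).decrypt(ciphertext))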

Key management by custodianship is important, especially where encryption administered under several administrative privileges could compromise data or lead to its theft. The custodian is given the mandate to manage the multi-layer key management infrastructure, including the creation of keys, the distribution and replacement of keys, and the deletion of compromised keys (Agrawal, Kiernan, Srikant, & Xu, 2002). Strong authentication should be required for key management functions. The second architecture of cloud computing, discussed by Messinger and Piech (2009), is hybrid computing, in which there is a composition of multiple internal and/or external providers; this is typical for most enterprises. A cloud can describe a local device, say a plug computer with cloud services, or a configuration that combines virtual and physical assets; for example, most virtualized environments require physical servers, routers, or other hardware such as a network appliance acting as a firewall. The last architecture of cloud computing is the private cloud, also known as the internal cloud, which was described above.

Berry, Djaoui et al. (2005) discuss the security issues associated with cloud computing, although they do not discuss the data that reside in the cloud. Cloud computing has attributes which must be assessed so that all matters of security and privacy are well tackled. The areas of data integrity, privacy of data, recovery of data, and evaluation of legal issues need to be critically analyzed for risk to be minimized. Cloud computing providers such as Google with its App Engine and Amazon with EC2 offer computing that can be defined as scalable, IT-enabled capability delivered as a service to external clients using Internet technologies. It is therefore imperative that customers demand a proper explanation of security policies and know what measures these providers are taking to assure their clients that they will not be exposed to security vulnerabilities in the course of using these services; providers should also be able to identify vulnerabilities that were not anticipated at first. The first issue to consider when deploying cloud computing is the privileges given to users to access their data. Data stored outside the premises of an enterprise brings with it the issue of security (Gronroos & Ojasolo, 2004): how safe is the data, and who else accesses it? Data that has been outsourced bypasses the controls of the enterprise's personnel. The client should get as much information as possible about how the data is stored and how its integrity is catered for, and providers should be asked specifically about their hiring of the privileged administrators who will manage the data.

The second issue to consider is regulatory compliance. Consumers are responsible for the security and integrity of their own data even when it is held and stored by other providers. Traditional service providers are subject to external audits by auditors who check the security policy of the enterprise; cloud computing providers should agree, in writing, to undergo these external audits as well. Another policy issue to consider is the location of the cloud. In most cases consumers do not know where the cloud is located, or even which country it is in; what they care about is that their data is being stored somewhere. Providers should indicate, in written form, their jurisdiction and should accept to obey local security policies on behalf of the consumers. Consumers should also be aware of the security breaches experienced by providers. Providers have always claimed that security is at its tightest in the cloud, but this claim alone is not enough to settle security concerns. All security systems that have been breached were once considered infallible, so with newer technologies they too can be broken into. An example is Google, which was attacked in 2007: its Gmail service was attacked and the company had to apologize. With this in mind, it is a good lesson that even though systems might be tight in the cloud, there is no full assurance that they will never be hacked. While providers of cloud computing face security threats, research has shown that cloud computing has become very attractive to cyber crooks; as the data in the cloud become richer, security should become tighter.

1.7 Technology used
Given the status of the cookbook system, there is a need for a private cloud as opposed to setting up on a public cloud; this will make the security of the system easier. A private cloud, and by extension cloud computing in general, can be divided into two parts: the front end and the back end, connected to each other through the Internet. The front end represents the client side, while the back end represents the section consisting of the cloud itself. The front end normally consists of the client's computer and the software that acts as the interface through which the client accesses the application from the Web server. Most of these interfaces are web browsers such as Internet Explorer or Firefox; other systems require unique applications that provide network access to clients.

Servers do not run at full capacity most of the time, so much of their processing power goes to waste. A server can, however, be made to behave as if it were multiple servers on the network, each running its own operating system; this technique is called server virtualization. With it, servers are used optimally, reducing the need for more servers on the network. The back end of cloud computing consists of various computers, servers and storage systems, all of which form the "cloud" part of cloud computing. In theory, cloud computing can include all kinds of applications, each installed on its own dedicated servers. A central server is then used to administer the system, monitor traffic within the cloud and handle all requests from clients. To achieve this, it follows sets of rules called protocols and uses a special kind of software called middleware, which enables the computers on the network to communicate with each other. The central server makes sure that everything runs smoothly.

There are cases where clients will need more disk space, so the providers must make provision for this. On top of this, the providers keep extra storage for backup data, so that in case of a breakdown the backup can still be accessed; the process of making copies of data as a backup is called redundancy. There remains the concern of securing the database stored in the cloud, which means I will need a way of securing it there.

1.8 Tools used for the system
There are other tools that will be used to bring Web 2.0 effects into the knowledge management system. These tools are open source.

1.8.1 AJAX
Asynchronous JavaScript and XML (AJAX) is a newer technique for developing Web applications that gives the user a richer, more interactive interface. This tool will be handy when creating menus, because the user will not have to wait for the ingredients to appear on the screen. Unlike a traditional Web page, which had to be resent in full to fetch data from a server, AJAX sends only a small request and updates the relevant portion of the page; all this happens behind the scenes, so the page loads at high speed. This feature greatly improves the download speed of most web pages. For our consulting business model it will be a great improvement, because our online users will appreciate the speed of our web site (Martin & Hoover, 2008). The same feature has improved search speed at companies like Google and Yahoo: when a user types some text in the search bar, auto-suggestions pop up on the screen, and this is AJAX at work. With this on our system, especially our small search application, more users will be drawn to our Web site, bringing more clients. The start-stop-start nature of fetching content from the server is eliminated with AJAX.

1.8.2 Flex
This is a technology that enables applications to be deployed on the Internet across different platforms. With different development companies coming up with different tools, it is imperative that these technologies blend with the ones already in place.

1.8.3 Google Web Toolkit (GWT)
This is a technology used to develop and debug AJAX applications. For our Web site to be fully Web 2.0 enabled, it will require the use of AJAX. AJAX is built with JavaScript, which lacks modularity and thus makes sharing, testing and reusing AJAX components difficult and fragile. GWT addresses this by letting developers build their applications using the Java development tools of their choice and then translate them into JavaScript and HTML for deployment. GWT thus offers the flexibility of writing AJAX applications in a mix of technologies.

1.9 Classes of applications
While some software developed using knowledge-based system techniques has yielded direct action in the world (e.g. in enhanced environmental and process control systems), the output from most expert systems has been a form of input to humans. The term expert system was meant to distinguish such applications from those which were not intended to act as experts but rather to capture the knowledge of human experts so as to support decision-making.

These applications can be classified into two groups: adviser applications, designed to support human decision-making, and genuinely expert applications, designed to replace human decision-making, e.g. in intelligent environment control and process control systems (Ramiler & Swanson, 2004). Some applications produce diagnoses after interpreting available evidence, others offer predictions, and some address industrial engineering matters such as the procedures followed in assembling components.

2.0 Problems
Knowledge-based system tools have disadvantages and dangers like any other new technology. Among these are the lack of enough skilled personnel and the immaturity of available tools; in addition, most expert systems deal with specific problems and thus do not support a complete job but rather one or two tasks within a cluster of jobs (Martin & Hoover, 2008). The advantage such software offers is therefore seldom the complete automation of a process with its attendant cost reduction. There are also implications for the ownership of software developed using knowledge-based system tools, since it is not always clear what the expert system actually constitutes.

2.1 Cookbook management system
The use of a knowledge base would be the best option for my cookbook management system. The problem with the current situation is that most users keep losing relevant information about the ingredients required for the best dishes in restaurants. A knowledge base would be handy here because the knowledge it holds will not be lost in this way (Prescott & McFadden, 2007). The system described here is a cookbook used for creating categories and then adding recipes to the database.

This cookbook management system is being created because of widespread concern that people forget the interesting recipes for their favourite dishes, and there has therefore been a desire for a management system that stores this important information. The system will be developed in VB.NET; this language was chosen because it supports Internet applications. With the shift to Internet applications, systems need to support them so that there is easy compatibility with most Web 2.0 applications, and since most new systems being developed today support Web 2.0 technology, knowledge-based systems should support it as well (Fowler & Worthen, 2009). The database used in the deployment of this system will be Microsoft SQL Server, preferred because of its high compatibility with Microsoft products such as VB.NET; the two are developed to complement each other. Connectivity to the database will be through the data-access facilities available in VB.NET. A user will have to have an account in order to log into the system and run the queries required to find the desired recipe.
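
The system itself is planned in VB.NET against Microsoft SQL Server; purely to illustrate the intended categories and recipes tables and the kind of queries involved, here is a minimal sketch that uses Python's built-in sqlite3 module as a stand-in database. The table and column names are assumptions rather than the final schema.

# Minimal stand-in for the cookbook database: categories, recipes, and a query.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE categories (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
    CREATE TABLE recipes (
        id INTEGER PRIMARY KEY,
        category_id INTEGER REFERENCES categories(id),
        title TEXT,
        ingredients TEXT,
        instructions TEXT
    );
""")

def add_category(name):
    return conn.execute("INSERT INTO categories (name) VALUES (?)", (name,)).lastrowid

def add_recipe(category_id, title, ingredients, instructions):
    conn.execute(
        "INSERT INTO recipes (category_id, title, ingredients, instructions) VALUES (?, ?, ?, ?)",
        (category_id, title, ingredients, instructions),
    )

def recipes_in_category(name):
    return conn.execute(
        "SELECT r.title FROM recipes r JOIN categories c ON r.category_id = c.id WHERE c.name = ?",
        (name,),
    ).fetchall()

desserts = add_category("Desserts")
add_recipe(desserts, "Fruit salad", "apple, banana, orange", "Chop fruit and mix.")
print(recipes_in_category("Desserts"))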

2.2 Components of the cookbook system
The cookbook system that will be developed will include the following components:
Rule-based reasoning. The most common features of knowledge-based systems built on a rule-based expert system structure include a friendly user interface, some element of intelligence, knowledge of how information is represented, and knowledge of user preferences acquired over time.

Databases. The database will contain all the data of interest to the system, which in this scenario means recipes. The recipes will record Calcium, Iron, Potassium, and other minerals of concern in good meals. The database will be connected to popular restaurant databases so that recipes of foreign origin can be obtained from those external databases; databases containing most of the popular recipes in the world have already been developed, and most of them are online (Dodani, 2009). The users may also be considered a database of sorts, because they are the source of other vital information in the system.

Inference engine. This represents the methods and knowledge for solving particular problems in the system. An interpreter analyses and processes the rules, while a scheduler decides which rule to process next. The search part of the system processes queries fed to the system: there will be situations where users want to find a particular recipe but do not have all the information required to retrieve it directly, and in such cases they will use the built-in search facility.
Knowledge base (rule base). This is where most of the problem-solving knowledge is stored, in the form of IF conditions and THEN actions; much of the system's logic is handled here. A rule's condition, once satisfied, is usually treated as a fact.
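
To show how these components might fit together for the cookbook, the sketch below matches a few invented IF-THEN rules about a user's available ingredients and dietary goals against the facts of a session and recommends recipes; the rules shown are illustrative examples, not the system's actual rule base.

# A tiny cookbook rule base: conditions on available ingredients and goals
# lead to recipe recommendations. All rules and facts here are hypothetical.

rules = [
    {"if": {"has_milk", "has_spinach", "goal_calcium"}, "then": "Recommend: spinach and cheese bake"},
    {"if": {"has_lentils", "goal_iron"}, "then": "Recommend: lentil soup"},
    {"if": {"has_banana", "goal_potassium"}, "then": "Recommend: banana smoothie"},
]

def recommend(facts):
    """Fire every rule whose conditions are all present in the session facts."""
    return [rule["then"] for rule in rules if rule["if"] <= facts]

session_facts = {"has_milk", "has_spinach", "goal_calcium", "has_banana", "goal_potassium"}
for suggestion in recommend(session_facts):
    print(suggestion)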

Knowledge-based systems have great potential that has yet to be fully tapped and exploited. Although the technology appears new, a great deal can be achieved with it, and there are dangers and challenges which must be appreciated and confronted for it to reach the heights expected of it. This paper has looked broadly at the use of this technology in the development of a cookbook for managing the recipes of the most interesting meals.
