How Internet resources are developed. Internet resource addressing. Basic terms used on the World Wide Web

The term “Internet resource” has historically been applied to complexes of pages: sites and portals. In general, educational resources can be divided into the following groups: programs for schoolchildren, university developments, and scientific works. Developers and website compilers present all of these information products in different forms.

First of all, these are libraries and thematic collections of materials. They contain textbooks, lectures, methodological developments, articles and other useful works.

Secondly, online (working when connected to the Internet) training programs are widespread. Such programs, as a rule, exist in the form of tests (on various topics and levels of difficulty).

The third category of educational resources consists of programs that, once installed on your computer, make life easier in many ways. They can replace not only a pencil, ruler and calculator, but can also provide ready-made algorithms and solutions for almost any problem. There is a great variety of such programs, and they differ in the breadth of the functions they perform as well as in quality, size and price. Many free and shareware programs are distributed on the Internet; they can be downloaded and used either under certain conditions (for example, with a time limit on use) or without any restrictions. A useful program may perform a limited number of functions, for example, only drawing graphs or containing the periodic table, or it may be a huge reference consultant covering several disciplines.

A generally accepted classification of educational resources has not yet emerged. For example, in the review by N.N. Soboleva and others, resources useful for schools are divided into the following sections:

information thematic resources;

official resources;

distance learning projects;

online publications;

exchange of experience and communication between teachers and schoolchildren.

Another approach to classification is possible, for example, based on the nature of the stored information and the way it is presented and used:

libraries and archives of texts and programs,

directories and collections of links,

business cards and showcases,

directories and databases,

test systems,

teleconferences and forums,

demonstration and interactive models.

Let's look at typical examples of educational resources and features of their use.

One of the most popular Internet technologies is the website, which combines multiple forms of information presentation: text, graphics, animation, sound, text and graphic hyperlinks, and interactive components. Among the huge number of sites, we can highlight those of interest to the field of education. These include educational portals, electronic textbooks, web quests, websites of educational institutions (schools, universities, IPKiPRO, etc.) and websites of institutions of interest to education (banks, enterprises, medical facilities, scientific laboratories, etc.).

On a school website, students have virtual classrooms, diaries, briefcases, desks and subjects; parents have access to the student's diary, consultations with specialists, homework assignments and parent-teacher meetings; teachers maintain a methodological base of materials and post information for students and parents; and the administration can promptly inform, diagnose and gather statistics. The school's pedagogical concept and its main structural elements determine the foundation of the school website, which should serve as a means of increasing the efficiency of all aspects of the school's activities. On their school's website, teachers can hold consultations for parents on questions of teaching and raising schoolchildren, and can use the information databases of the K.D. Ushinsky Pedagogical Library, the Library of the US Congress, the University of London, Moscow State University, domestic and foreign archives, and the world's leading museums. Teachers can join existing initiatives on the network and, within their framework, organize students' work in a variety of network projects, or they can take their own initiative and organize a unique telecommunications project.

A web page is an integral part of a website. It may contain text, images, and hypertext links to other pages or other servers. Physically it is a file, which is in essence the most important concept of an Internet resource: ultimately it is individual files that interest the Internet user, and all other notions of resources are combinations of various files into a complex. The variety of files is great (not just web pages): any of them can be an Internet resource which, if the user wishes, can be saved to the computer's hard drive. These include programs, text and graphic files, various multimedia files, compiled HTML files, tables, archives, applications, etc.

An educational web quest consists of pages on a specific topic on educational sites, connected by a large number of hyperlinks to pages on other sites on the World Wide Web. For example, a page for an astronomy course may have links to the servers of actually operating observatories, the libraries of research institutes, and space organizations. The efficiency of students' work and the time they save searching the Internet for the necessary information depend on the careful selection of links. The student independently chooses which materials to examine in detail and which to skip. The resources of the domestic Internet are already sufficiently developed to serve as material for creating educational web quests. One well-known Russian quest is the “Defend Baikal” website, developed by teachers and students of a gymnasium in Angarsk. A web quest involves the work of a group of experts according to a certain scheme, which requires analyzing a large number of information sources, a list of which is given on a separate web page of the site.

The word “portal” came to the Internet from architecture, where it means “main entrance.” It refers to a site from which a person regularly begins work on the Internet and which he makes the home page of his browser.

A portal must combine web services, content and links to other resources in a way that meets the needs of a large number of users. The main idea behind a portal's existence is that, having created a certain critical mass of services, it can attract such a number of users that it becomes “self-replenishing,” after which the portal's traffic grows with virtually no additional advertising costs.

An educational portal is a site that contains a large collection of structured links to educational Internet resources along with its own educational pages. The portal is, so to speak, the “main entrance” to the educational space of the network. Examples today include the federal portal “Russian Education” (www.edu.ru), the Russian General Education Portal (www.school.edu.ru), the “Humanities” portal (http://www.auditorium.ru/), the natural-science educational portal (http://en.edu.ru/), and the information support portal for the Unified State Exam (http://ege.edu.ru/).

Electronic libraries, modern complex information systems, are regarded as distributed knowledge repositories located on different computers. They provide a special kind of dissemination service. Most often, access to digital library catalogs is provided free of charge. Moreover, there are now a huge number of projects trying to provide free access to various publications in electronic format, including teaching aids. At the present stage of Internet development, electronic libraries represent an area of research and development aimed at advancing the theory and practice of data collection, data modeling, data management and data distribution over networks. The rapid development of the Internet and multimedia technologies in recent years has produced methods for creating electronic information collections, which have become the technical basis of the libraries of the future. Popular electronic libraries include the Moshkov Library (http://lib.ru), the “Sit and Read” network library (http://lib.km.ru), and the Open Russian Electronic Library (OREL) (http://orel.rsl.ru). A remote-access laboratory is a department of an educational organization equipped with real educational and research equipment that can be reached remotely via telecommunication channels.

It is also worth considering in more detail one category of informational educational resources: online educational programs.

Among the professional Russian developers, one can note the company “PHYSICON” (http://www.physicon.ru), which is engaged in the development of educational multimedia computer programs in the field of natural sciences (mathematics, physics, astronomy, chemistry, economics, biology and others).

Popular knowledge-testing systems in various fields are implemented on many educational servers. For example, the website “Testing. Professional assessment of your knowledge” (http://tests.specialist.ru) offers the opportunity to assess knowledge in the field of information technology. Of the 33 tests offered, the most interesting are “Basic computer training” (Windows, MS Office), “Internet technologies” (Internet, HTML, Flash, web mastering), “Computer graphics” (Corel Draw, Adobe Photoshop, Adobe Illustrator, 3D Max, QuarkXPress), “Administration of computer systems and networks” (Windows, Unix), “Programming” (C, C++, Delphi), “Databases” (Oracle, Access), “Office specialties” (office manager, assistant secretary), “Accounting” (1C: Accounting 7.7), and “Project Management” (Microsoft Project).

Another popular site, “On-line Exams” (http://www.examen.ru), offers free exams (more than 40) and tests (more than 50) in many disciplines. Here is a large selection of essays, textbooks, reference manuals, which are formed into sections: “Human Sciences” (human anatomy and physiology, ecology, psychology, anthropology, medicine), “Natural Sciences” (chemistry, zoology, botany, general biology, astronomy, geography, genetics and selection, physics, paleontology), “Social and historical sciences” (archaeology, sociology, philosophy, economics, history), “Art, culture and religion” (architecture, literature). The exams provided on the server can be very useful. For example, “Traffic Inspectorate Exam”, “International English Language Exams”, “Microsoft Certification Exams”, “School Exams”.

The educational website “Anri education systems” (http://www.anriintern.com) contains a huge number of courses, lectures and useful materials not only for schoolchildren and students. The site offers you not only to test your existing knowledge, but also to acquire additional knowledge. For this purpose, courses on different topics are offered, for example, the course “European Regional Studies”.

The 50-lesson Business Course provides information on the practical steps to take when starting your own business and the requirements that must be met.

The courses “On programming, development and operation of computers” are divided into several thematic sections. The textbook allows you to independently write programs and become familiar with basic algorithms and programming techniques. The Anri education systems website has a large selection of courses for self-education. For example, the following projects have been developed for students of English: “English in proverbs and sayings”, “Slang, aphorisms and colloquial speech in English”, “English through British legends, myths and fairy tales”, “English through reading the classics”, “Modal verbs”, “Idioms”, “Learn English jokingly”. Courses in German, French, Spanish, Czech, Chinese and Russian are open for study.

Courses have been developed in other disciplines: “Fundamentals of Ecology”, “Historical Courses”, “Geographical Courses”, “Marketing. Business in networks”, “Economics”, “Methodology for concluding contracts”, “Blind typing method on the keyboard”.

Also worth noting are library resources, electronic archives in all branches of knowledge and the arts, electronic mass media, and the like, which play the basic role of primary sources while having very complex and ramified organizational and production structures. Students use the Internet to search information databases for what they need to solve an educational problem, to become acquainted with different points of view on the problem being studied, to take part in various national and international network initiatives (olympiads, quizzes, competitions, projects), to correspond with peers in their native and foreign languages, to participate in domestic and international chat sessions and video, audio and teleconferences on various issues, to publish their creative and journalistic works on the Internet (essays, drawings, images of crafts, articles, photographs, etc.), to study subjects in distance learning courses, and to take part in testing conducted by universities. They also create informational and educational web resources themselves.

The entire process of creating a website can be divided into four stages:

1. Creating the terms of reference for website development and preparing the site's idea and structure.

2. Creating a design for the future website in accordance with the chosen structure.

3. Adapting and customizing the technical part (ImageCMS): adapting the general look of the site and the display of information sections and technical modules, and setting up the administrative part for convenient management of the information on the site, including the creation of all necessary information sections and technical modules.

4. Preparing materials and the information content of the site.

Preparation of materials is the selection of the materials that will be posted on the site, the analysis of existing ones, and their adaptation for correct display on the Internet.

The information content of the site comprises all the main texts as well as the site's images. To ensure maximum effectiveness of the site, the information materials must contain the basic answers to users' possible questions.

Standard design of an economic information system (EIS). Basic concepts and classification of standard design methods

Standard design is carried out on the basis of experience gained in the development of individual projects. Standard projects, as generalizations of experience for certain groups of organizational-economic systems or types of work, are in each specific case associated with many specific features and differ in the degree to which they cover management functions, the work performed and the project documentation developed.

Standard EIS design methods assume building a system from ready-made, purchased standard elements (standard design solutions). To do this, the designed EIS must be decomposed into many components (subsystems, complexes of tasks, software modules, etc.), for which standard design solutions available on the market are selected and purchased. The purchased standard elements, usually including software products, are then customized to the characteristics of a particular enterprise or modified in accordance with the requirements of the problem area.

By a standard design solution we mean a design solution presented in the form of design documentation, including software modules, suitable for reuse. Standard design solutions are also called replicable products.

Depending on the level of decomposition of the system, elemental, subsystem and object methods of standard design are distinguished.

In the elemental method of standard EIS design, a standard solution for a task or for a separate type of task support (informational, software, technical, mathematical, organizational) is used as the standard element of the system. Its advantage is the use of a modular approach to the design and documentation of the EIS. Its disadvantages are the large amount of time spent on integrating dissimilar elements and the poor adaptability of the elements to the characteristics of the enterprise.

When the subsystem method is used, individual subsystems act as the elements of typification; they provide functional completeness, minimization of external information connections, parametric customizability, and alternative schemes within the range of values of the input parameters. In this case a higher degree of integration of standard EIS elements is achieved.

With the object method, a standard project for management objects of a certain industry is used as a standard element, which includes a full set of functional and supporting EIS subsystems. The undoubted advantage of the object method lies in the integrability of all components due to methodological unity and information, software and technical compatibility of the components.

Types of Internet resources

The main goal of studying any subject in secondary school is the formation of communicative competence; all other goals (educational, upbringing-related, developmental) are realized in the process of achieving this main one. The communicative approach involves learning to communicate and developing the capacity for intercultural interaction, which is the basis of how the Internet functions. Outside of communication the Internet makes no sense: it is an international, multinational, cross-cultural society whose life is based on the electronic communication of millions of people around the world talking at the same time. By engaging in online communication in class, we create a model of real communication.

The development of education today is organically connected with an increase in the level of its information potential. This characteristic feature largely determines both the direction of the evolution of education itself and the future of the entire society. For the most successful navigation in the global information space, it is necessary for students to master information culture, since priority in searching for information is increasingly given to the Internet.

As an information system, the Internet offers its users a wide variety of information and resources. The basic set of Internet services includes:

electronic mail (e-mail);

teleconferences (usenet);

video conferencing;

the ability to publish your own information, create your own home page and post it on a Web server;

access to information resources:

reference catalogs (Yahoo, InfoSeek/UltraSmart, LookSmart, Galaxy);

search engines (AltaVista, HotBot, Open Text, WebCrawler, Excite);

online conversation (Chat).

Because modern transformations in Kazakhstani education are taking place in an era of rapid development of high technology and of the expansion of the information space via the Internet, any of these resources can be actively used in lessons of both the humanities cycle and the science and mathematics cycle.

Today a teacher must have the skills to collaborate with students on the basis of information interaction and must be able to select, structure and evaluate the information needed to solve a wide range of educational problems. Changes in the structure and content of general and secondary education (the UNT, specialized schools, various kinds of testing) have led teachers to actively use the computer not only in upper-level lessons but also in middle- and junior-level lessons, where study of the Internet is not part of the curriculum.

Lessons using Internet resources are a fusion of new information technologies with new pedagogical technologies: the teacher's own position changes (the teacher ceases to be a “source of knowledge” and becomes a co-author and an organizer of the process of research, search, information processing and the creation of creative works, implementing an activity-based approach to education).

The most common Internet resource used in school lessons is a website. The sites are easy to use and widely used by all teachers who have access to the Internet in the classroom. On search sites there are many links to educational sites in various subjects: mathematics, biology, geography, chemistry, physics, computer science, Russian language and literature, for primary education. Here you can find interesting lesson developments in text format, in the form of presentations, in the form of flipcharts, which can be used in preparation for a lesson or extracurricular event.

Let us give an approximate classification of sites. In practice, sites very often combine several types. A classification of sites is needed in order to understand what type of site is specifically required for particular goals and objectives (Figure 1.1). Let us dwell in more detail on the types of sites most often used in the educational process and in research work.

Informational resources

Thematic sites

This type of Internet site is characterized by the fact that it contains information on a specific topic. Online encyclopedias also belong here. The volume of such a site can be 10 pages or more; the bigger, the better. The format of the materials can be anything: plain text, video, audio podcasts, etc.

The peculiarity of the thematic site is that the free materials contained on the site are in open access and provide the visitor with information on any issue. For example, if a thematic site contains information about house plants, then it should contain information about caring for them, watering, replanting, fertilizers, etc.

Internet portals

Portals are a type of website containing a large amount of varied information. As a rule, portals are similar in structure to thematic sites, but have more developed functionality and a larger number of services and sections. Portals also often have sections for user communication: chats, blogs and forums.

A blog (blog - online diary) is a type of site on which the owner or editor of the blog writes posts with his news, ideas or other constantly incoming information. A distinctive feature of blogs is the relevance of the published information.

Blogs have replaced personal pages of users on the Internet. This is a kind of virtual diary, which is hosted on a special resource that provides the ability to add entries, comment, compile a list of friends, bookmark sites you like, etc.

Website directories

This is a type of site whose main content consists of structured links to other sites together with brief descriptions of them. As a rule, the sites are grouped by topic, or the directory has a narrow thematic focus (a so-called thematic directory). Site directories can be moderated or unmoderated. An unmoderated directory (FFA, “free for all”) is one in which anyone can post a link to their site without verification by a directory moderator. In a moderated directory, the moderator monitors the subject matter and quality of the sites listed and may refuse to list a site, guided by the directory's rules.

Figure 1.1. Main types of sites

Web services

Search engines

A search engine is a special type of website with which a visitor can find the information he is interested in by entering a query in a special field and receiving a list of sites that match the query. It is these sites that both students and teachers most often use if they do not know where to find the information they are interested in. In the input field you need to enter a word or phrase that should be contained in the document. Most often, the subject of the file you are looking for is entered into this field.
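The matching step that a search engine performs can be illustrated with a toy example. This is a minimal sketch of the idea only (real engines use crawled indexes and ranking algorithms); the site names and texts below are invented for illustration.

```python
# A toy "search engine": given a query, return the names of sites
# whose text contains every word of the query (case-insensitive).
# Real engines use inverted indexes and ranking; this shows only the idea.

def search(query, sites):
    words = query.lower().split()
    return [name for name, text in sites.items()
            if all(w in text.lower() for w in words)]

# Hypothetical mini-collection of "sites" (invented for the example):
sites = {
    "astronomy-lessons": "lectures and tests on astronomy for schoolchildren",
    "chemistry-lab": "virtual chemistry experiments and tests",
    "math-club": "olympiad problems in mathematics",
}

print(search("tests astronomy", sites))  # ['astronomy-lessons']
```

Entering the subject of the file you are looking for corresponds here to the `query` argument; the list returned corresponds to the list of matching sites the visitor receives.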

Postal services

This type of site provides an interface for working with email. Email sites are sites that allow you to create a mailbox (usually free of charge) and manage it.

Internet forums

On this type of site, users can create topics for discussion and then comment on them. As a rule, forums are limited to one specific topic. The forum offers a set of sections for discussion. The work of the forum consists of users creating topics in sections and subsequent discussions within these topics. A single topic is, in fact, a thematic guest book.

Chats

This type of site, with the help of special functions operating on it (registration, user functions and moderation), allows visitors to communicate in real time. The chat looks like a window in which all messages from chat participants are displayed. Chats often also provide the ability to view archives and send files.

Hosting sites

Sites of this type implement the function of storing any files. There are also often hosting sites with the ability to view downloaded files directly through the browser.

Hosting is the service of placing someone else's website on one's own web server, or someone else's web server on one's own premises, i.e. granting it the right to connect to the Internet and maintaining it. As a rule, demand for website hosting is much greater than for server hosting, since the latter is needed only for fairly large websites. The sites or servers that provide this service are themselves also called hosting sites.

File archives

File archives are a storehouse of all kinds of virtual information, ranging from articles to software. File archives can be enormous, and separate computers (servers) are often allocated to this type of site on the network. In addition, file archives often allow a visitor to upload his own information to the appropriate section of the archive and, of course, to download the files he needs.

Basic Internet resources

Internet

The Internet is a worldwide system of interconnected computer networks for storing and transmitting information. It is often referred to as the World Wide Web or the Global Network, or simply the Network. It is built on the TCP/IP protocol stack. The World Wide Web (WWW) and many other data transmission systems operate on top of the Internet.

Basic Internet Resources

Let's consider the main resources (services) of the Internet. The most popular Internet resource is the World Wide Web (WWW), a collection of interconnected hypermedia documents. It comprises a huge number (over a billion) of multimedia documents whose distinctive feature, besides their attractive appearance, is the ability to refer to one another: the current document can contain a link that leads to any WWW document, which may physically be located on another computer on the Internet.

The next network resource is FTP (File Transfer Protocol), a system for storing and transferring files of all kinds.

The oldest Internet resource is e-mail (electronic mail), a system for forwarding electronic messages.

A global distributed system called News Groups is designed to conduct online discussions. One of the most popular systems of this kind is the Usenet newsgroup.

The Telnet service allows you to connect to a remote computer and work with its resources. This is a service for remote computer management.

Finally, the Internet has the IRC (Chat) system, which implements live communication between users in real time by entering text from the keyboard.

The World Wide Web

The World Wide Web is a distributed system that provides access to interconnected documents located on different computers connected to the Internet. The word “web” and the abbreviation WWW are also used to denote the World Wide Web. The World Wide Web is the largest worldwide multilingual information repository in electronic form: tens of millions of interconnected documents located on computers around the globe. It is considered the most popular and interesting service on the Internet, allowing access to information regardless of its location. To find out the news, learn something, or simply be entertained, people watch TV, listen to the radio, and read newspapers, magazines and books. The World Wide Web also offers its users radio broadcasts, video, the press and books, with the difference that all of this can be obtained without leaving home. It does not matter in what form the information you are interested in is presented (a text document, photograph, video or sound fragment) or where it is located geographically (in Russia, Australia or Ivory Coast): you will receive it on your computer within minutes.

The World Wide Web is made up of hundreds of millions of web servers. Most of the resources on the World Wide Web are hypertext. Hypertext documents posted on the World Wide Web are called web pages. Several web pages that share a common theme, design and links, and are usually located on the same web server, are called a website. Special programs, browsers, are used to download and view web pages. The World Wide Web caused a real revolution in information technology and a boom in the development of the Internet. When people talk about the Internet they often mean the World Wide Web, but it is important to understand that the two are not the same thing.

History of the World Wide Web

Tim Berners-Lee and, to a lesser extent, Robert Cailliau are considered the inventors of the World Wide Web. Tim Berners-Lee is the originator of the HTTP, URI/URL and HTML technologies. In 1980 he worked at the European Council for Nuclear Research (Conseil Européen pour la Recherche Nucléaire, CERN) as a software consultant. It was there, in Geneva, Switzerland, that he wrote the Enquire program for his own needs; it used random associations to store data and laid the conceptual groundwork for the World Wide Web.

In 1989, while working at CERN on the organization's intranet, Tim Berners-Lee proposed the global hypertext project now known as the World Wide Web. The project envisaged the publication of hypertext documents linked by hyperlinks, which would facilitate the search for and consolidation of information for CERN scientists. To implement the project, Tim Berners-Lee (together with his assistants) invented URIs, the HTTP protocol and the HTML language, technologies without which the modern Internet can no longer be imagined. Between 1991 and 1993 Berners-Lee refined the technical specifications of these standards and published them. Nevertheless, 1989 should be considered the official year of birth of the World Wide Web.

As part of the project, Berners-Lee wrote the world's first web server, httpd, and the world's first hypertext web browser, called WorldWideWeb. This browser was also a WYSIWYG editor (short for What You See Is What You Get). Its development began in October 1990 and was completed in December of the same year. The program ran in the NeXTStep environment and began to spread across the Internet in the summer of 1991.

The world's first website was hosted by Berners-Lee on August 6, 1991, on the first web server, accessible at http://info.cern.ch/. The resource defined the concept of the World Wide Web, contained instructions for installing a web server, using a browser, etc. This site was also the world's first Internet directory, because Tim Berners-Lee later posted and maintained a list of links to other sites there.

Since 1994, the main work on the development of the World Wide Web has been taken over by the World Wide Web Consortium (W3C), founded and still led by Tim Berners-Lee. This consortium is an organization that develops and implements technology standards for the Internet and the World Wide Web. W3C Mission: “Unleash the full potential of the World Wide Web by establishing protocols and principles to ensure the long-term development of the Web.” Two other major goals of the consortium are to ensure full “internationalization of the Web” and to make the Web accessible to people with disabilities.

The W3C develops common principles and standards for the Internet (called “recommendations,” English: W3C Recommendations), which are then implemented by software and hardware manufacturers. This ensures compatibility between the software products and equipment of different companies, making the World Wide Web more advanced, versatile and convenient. All recommendations of the World Wide Web Consortium are open, that is, they are not protected by patents and can be implemented by anyone without any financial contribution to the consortium.

Structure and principles of the World Wide Web

The World Wide Web is made up of millions of Internet web servers located around the world. A web server is a program that runs on a computer connected to a network and uses the HTTP protocol to transfer data. In its simplest form, such a program receives an HTTP request for a specific resource over the network, finds the corresponding file on the local hard drive and sends it over the network to the requesting computer. More sophisticated web servers are capable of dynamically generating documents in response to an HTTP request using templates and scripts.
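The request-response cycle described above can be sketched with Python's standard library: a minimal server that answers every GET request with a small hypertext document, and a client that requests it. The handler, port choice and page content here are illustrative, not part of any real site.

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    """Answer every GET request with a tiny hypertext document."""
    def do_GET(self):
        body = b"<html><body>Hello, World Wide Web!</body></html>"
        self.send_response(200)                        # HTTP status line
        self.send_header("Content-Type", "text/html")  # tell the client it is hypertext
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)                         # send the document itself

    def log_message(self, *args):
        pass  # keep the console quiet

# Port 0 asks the OS for any free port; serve in a background thread.
server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "browser" side: request the resource over HTTP and read the response.
with urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/") as resp:
    status, page = resp.status, resp.read().decode()

server.shutdown()
print(status, page)
```

A real web server would, as the text notes, map the request path to a file on disk or to a script; this sketch returns the same fixed document for every path.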

To view the information received from the web server, a special program, a web browser, is used on the client computer. The main function of a web browser is to display hypertext. The World Wide Web is inextricably linked with the concepts of hypertext and hyperlinks; most of the information on the Internet is hypertext.

To facilitate the creation, storage and display of hypertext on the World Wide Web, HTML (HyperText Markup Language) is traditionally used. The work of creating (marking up) hypertext documents is called layout; it is done by a webmaster or a separate markup specialist, a layout designer. After HTML markup, the resulting document is saved to a file, and such HTML files are the main type of resource on the World Wide Web. Once an HTML file is made available to a web server, it is called a “web page.” A collection of web pages makes up a website.
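As a sketch of how hypertext markup is machine-processed, the snippet below uses Python's standard html.parser to pull the hyperlink targets out of a small HTML page; the page content is invented for illustration (only the http://info.cern.ch/ address comes from the text above).

```python
from html.parser import HTMLParser

# A minimal hypertext document, marked up in HTML (hypothetical content).
PAGE = """<html><head><title>Demo</title></head>
<body><p>See the <a href="http://info.cern.ch/">first website</a>.</p></body></html>"""

class LinkCollector(HTMLParser):
    """Collect the targets of all <a href="..."> hyperlinks in a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            # attrs is a list of (name, value) pairs; keep the href values.
            self.links.extend(v for k, v in attrs if k == "href")

collector = LinkCollector()
collector.feed(PAGE)
print(collector.links)
```

This is essentially what a browser does on a larger scale: parse the markup, find the hyperlinks, and render the rest as formatted text.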

The hypertext of web pages contains hyperlinks. Hyperlinks help World Wide Web users navigate easily between resources (files), regardless of whether the resources are located on the local computer or on a remote server. Uniform Resource Locators (URLs) are used to determine the location of resources on the World Wide Web. For example, the full URL of the main page of the Russian section of Wikipedia looks like this: http://ru.wikipedia.org/wiki/Main_page. Such URL locators combine the URI (Uniform Resource Identifier) identification technology and the DNS (Domain Name System) domain name system. The domain name (in this case ru.wikipedia.org) as part of the URL designates the computer (more precisely, one of its network interfaces) that executes the code of the desired web server. The URL of the current page can usually be seen in the browser's address bar, although many modern browsers prefer to show only the domain name of the current site.
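The parts of a URL described above can be separated programmatically; a minimal sketch with Python's urllib.parse, using the Wikipedia address from the text:

```python
from urllib.parse import urlsplit

url = "http://ru.wikipedia.org/wiki/Main_page"
parts = urlsplit(url)

print(parts.scheme)   # the protocol used to fetch the resource
print(parts.netloc)   # the domain name, resolved to a network address via DNS
print(parts.path)     # the path identifying the resource on that server
```

Here the scheme is "http", the network location is "ru.wikipedia.org", and the path is "/wiki/Main_page", exactly the components the URL standard combines.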

World Wide Web Technologies

To improve the visual perception of the web, CSS technology has become widely used, which allows you to set uniform design styles for many web pages. Another innovation worth paying attention to is the URN (Uniform Resource Name) resource naming system.

A popular concept for the development of the World Wide Web is the creation of the Semantic Web. The Semantic Web is an add-on to the existing World Wide Web designed to make information posted on the network more understandable to computers. It is a concept of a network in which every resource in human language would be provided with a description that a computer can understand. The Semantic Web opens access to clearly structured information for any application, regardless of platform and programming language. Programs will be able to find the necessary resources themselves, process information, classify data, identify logical connections, draw conclusions and even make decisions based on those conclusions. If widely adopted and implemented wisely, the Semantic Web has the potential to spark a revolution on the Internet. To create a machine-readable description of a resource on the Semantic Web, the RDF (Resource Description Framework) format is used, which is based on XML syntax and uses URIs to identify resources. Newer developments in this area are RDFS (RDF Schema) and SPARQL (pronounced “sparkle”), a new query language for quick access to RDF data.
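Since RDF is based on XML syntax, even a generic XML parser can read a simple machine-readable description. The fragment below is a hypothetical example (the resource URI and its properties are invented for illustration); it is parsed with Python's standard xml.etree into (subject, property, value) triples, the basic unit of RDF data.

```python
import xml.etree.ElementTree as ET

# A minimal, illustrative RDF/XML fragment (hypothetical resource and properties).
RDF_XML = """<?xml version="1.0"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:dc="http://purl.org/dc/elements/1.1/">
  <rdf:Description rdf:about="http://example.org/page">
    <dc:title>Example Page</dc:title>
    <dc:creator>Example Author</dc:creator>
  </rdf:Description>
</rdf:RDF>"""

RDF_NS = "{http://www.w3.org/1999/02/22-rdf-syntax-ns#}"

def describe(rdf_text):
    """Extract (subject URI, property, value) triples from simple RDF/XML."""
    root = ET.fromstring(rdf_text)
    triples = []
    for desc in root.findall(RDF_NS + "Description"):
        subject = desc.get(RDF_NS + "about")   # the URI identifying the resource
        for prop in desc:
            # Strip the namespace braces from the tag for readability.
            name = prop.tag.split("}", 1)[1]
            triples.append((subject, name, prop.text))
    return triples

triples = describe(RDF_XML)
print(triples)
```

Real Semantic Web toolkits go much further (inference, SPARQL queries), but the example shows the core idea: every statement ties a property and value to a resource identified by a URI.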

Basic terms used on the World Wide Web

Working with the browser

Today, ten years after the invention of the HTTP protocol, which formed the basis of the World Wide Web, the browser is a highly complex piece of software that combines ease of use and a wealth of capabilities.
The browser not only opens to the user the world of hypertext resources of the World Wide Web; it can also work with other Internet services such as FTP, Gopher and WAIS. Along with the browser, a program for using e-mail and news services is usually installed on the computer. Essentially, the browser is the main program for accessing Internet services. Through it you can access almost any Internet service, even if the browser does not support that service directly. For this purpose, specially programmed web servers are used that connect the World Wide Web with the given network service. An example of this kind of web server is the numerous free mail servers with a web interface (see http://www.mail.ru).
Today there are many browser programs created by various companies. The most widely used and recognized browsers are Netscape Navigator and Internet Explorer. It is these browsers that compete most directly with each other, although the programs are similar in many ways. This is understandable: they work according to the same standards, the standards of the Internet.
Working with the browser begins with the user typing the URL of the resource he wants to access into the address bar and pressing the Enter key.

The browser sends a request to the specified Internet server. As the elements of the requested web page arrive from the server, the page gradually appears in the browser's working window. The progress of receiving page elements from the server is displayed in the status bar at the bottom of the browser.

Text hyperlinks in the received web page are typically highlighted in a color different from the rest of the document text and are underlined. Links pointing to resources the user has not yet viewed and links to resources already visited usually have different colors. Images can also function as hyperlinks. Whether the link is textual or graphic, the mouse pointer changes shape when you hover over it, and the address the link points to appears in the browser's status bar.

When you click on a hyperlink, the browser opens the resource to which it points in the working window, and the previous resource is unloaded from it. The browser keeps a list of viewed pages and the user, if necessary, can go back along the chain of viewed pages. To do this, click on the "Back" button in the browser menu - and it will return to the page you were viewing before opening the current document.
Each time you click this button, the browser will go back one document in the list of visited documents. If you suddenly go back too far, use the "Forward" button in the browser menu. It will help you move forward through the list of documents.
The "Stop" button will stop loading the document. The "Reload" button allows you to reload the current document from the server.
The browser can show only one document in a window: to display another document, it unloads the previous one. It is much more convenient to work in several browser windows at once. A new window is opened via the menu: File – New – Window (or the key combination Ctrl+N).

Working with a document

The browser allows you to perform a set of standard operations on a document. The loaded web page can be printed (in Internet Explorer via the “Print” button or the menu: File – Print...) or saved to disk (menu: File – Save As...). You can search the loaded page for a piece of text you are interested in via the menu: Edit – Find on this page.... And if you want to see what the document looks like in the original hypertext that the browser processed, select View – As HTML from the menu.
When, while browsing the Internet, a user finds a page of particular interest, he can use the bookmark feature provided in browsers (similar to the bookmarks that mark interesting parts of a book).
This is done through the menu: Favorites – Add to Favorites. After this, the new bookmark appears in the list of bookmarks, which can be viewed by clicking the “Favorites” button on the browser panel or through the Favorites menu.
Existing bookmarks can be deleted, edited, or organized into folders using the menu: Favorites – Organize favorites.

Working through a proxy server

  • The Semantic Web involves improving the coherence and relevance of information on the World Wide Web through the introduction of new metadata formats.
  • The Social Web relies on the work of organizing the information available on the Web performed by Web users themselves. In this second direction, developments belonging to the Semantic Web are actively used as tools (RSS and other web feed formats, OPML, XHTML microformats). Partially semanticized sections of the Wikipedia category tree help users navigate the information space consciously; however, the very soft requirements for subcategories do not give reason to hope that such sections will expand. In this regard, attempts to compile knowledge atlases may be of interest.

There is also the popular concept of Web 2.0, which summarizes several directions of development of the World Wide Web.

Web 2.0

The development of the WWW has recently been driven largely by the active introduction of new principles and technologies collectively called Web 2.0. The term Web 2.0 first appeared in 2004 and is intended to illustrate the qualitative changes in the WWW in the second decade of its existence. Web 2.0 is a logical improvement of the Web. Its main feature is improved and accelerated interaction of websites with users, which has led to a rapid rise in user activity. This showed up in:

  • participation in Internet communities (in particular, in forums);
  • posting comments on websites;
  • maintaining personal journals (blogs);
  • placing links on the WWW.

Web 2.0 introduced active data exchange, in particular:

  • export of news between sites;
  • active aggregation of information from websites;
  • use of an API to separate site data from the site itself.

From the point of view of website implementation, Web 2.0 raises the requirements for the simplicity and convenience of websites for ordinary users, anticipating that the skill level required of users will drop rapidly in the near future. Compliance with the list of W3C standards and conventions is brought to the fore. In particular, these are:

  • standards for the visual design and functionality of websites;
  • standard requirements (SEO) of search engines;
  • XML and open information exchange standards.

On the other hand, Web 2.0 has lowered:

  • requirements for “brightness” and “creativity” of design and content;
  • needs for comprehensive websites (portals);
  • the importance of offline advertising;
  • business interest in large projects.

Thus, Web 2.0 marked the transition of the WWW from isolated, expensive, complex solutions to highly standardized, cheap, easy-to-use sites with the ability to exchange information effectively. The main reasons for this transition were:

  • critical lack of quality information content;
  • the need for active self-expression of the user on the WWW;
  • development of technologies for searching and aggregating information on the WWW.

The transition to the set of Web 2.0 technologies has consequences for the global WWW information space such as:

  • the success of the project is determined by the level of active communication between project users and the level of quality of information content;
  • websites can achieve high performance and profitability without large investments due to successful positioning on the WWW;
  • individual WWW users can achieve significant success in implementing their business and creative plans on the WWW without having their own websites;
  • the concept of a personal website gives way to the concepts of a “blog” and an “author’s column”;
  • fundamentally new roles for active WWW users appear (forum moderator, authoritative forum participant, blogger).

Web 2.0 Examples
Here are a few examples of sites that illustrate Web 2.0 technologies and that have actually changed the WWW environment.

In addition to these projects, there are other projects that shape the modern global environment and are based on the activity of their users. Sites whose content and popularity are formed, first of all, not by the efforts and resources of their owners but by a community of users interested in the site's development constitute a new class of services that determine the rules of the global WWW environment.

FTP

FTP (File Transfer Protocol) is a standard protocol designed for transferring files over TCP networks (for example, the Internet). FTP is often used to upload web pages and other documents from a developer's private machine to public hosting servers.

The protocol is built on a client-server architecture and uses separate network connections to transfer commands and data between client and server. FTP users can authenticate by passing a username and password in clear text or, if the server allows it, connect anonymously (this access method is often considered more secure, because it does not expose users' real passwords to interception). For secure transfers that hide (encrypt) the login and password and also encrypt the content, the SSH protocol can be used.

The first FTP client applications were interactive command-line tools implementing the standard commands and syntax. Graphical user interfaces have since been developed for many of the operating systems still in use today. These range from general web design programs like Microsoft Expression Web to specialized FTP clients (for example, CuteFTP).

FTP is one of the oldest application protocols, appearing long before HTTP, and even before TCP/IP, in 1971. It is still widely used today for software distribution and access to remote hosts.

FTP differs from other applications in that it uses two TCP connections to transfer the file:

  • Control connection - a connection for sending commands to the server and receiving its responses. The control channel uses the Telnet protocol.
  • Data connection - a connection for transferring files.

History

The first implementation of the protocol (1971) provided for an exchange between client and server of messages consisting of a header (72 bits) and data of variable length. The header included the request to the server or the response from it, as well as the type and length of the transmitted data. Request parameters (for example, the path and file name), information from the server (for example, a list of files in a directory) and the files themselves were transmitted as data. Thus, commands and data were transmitted over the same channel.

In 1972, the protocol was completely changed and took on a form close to the modern one. Commands with parameters from the client and server responses are transmitted over a TELNET connection (control channel); a separate connection (data channel) is created for data transfer.

In subsequent editions, the ability to work in passive mode and to transfer files between FTP servers was added, and commands were introduced for obtaining information, changing the current directory, creating and deleting directories, and saving files under a unique name. For some time there were commands for sending e-mail via FTP, but they were later removed from the protocol.

In 1980, the FTP protocol began to use TCP. The latest version of the protocol was released in 1985. In 1997, an addition to the protocol appeared that allows information to be encrypted and signed in the control channel and data channel. In 1999, an addendum dedicated to protocol internationalization was released, which recommended the use of UTF-8 encoding for server commands and responses and defined a new LANG command that sets the response language.

Protocol Description

Difference from HTTP

Property                                                     | FTP                   | HTTP
Based on work sessions                                       | Yes                   | No
Built-in user authentication                                 | Yes                   | No
Mainly intended for transferring                             | Large binary files    | Small text files
Connection model                                             | Dual connection       | Single connection
Mainly adapted for                                           | Sending and receiving | Receiving
Supports text and binary transfer modes                      | Yes                   | No
Supports specifying the type of transmitted data (MIME)      | No                    | Yes
Supports file-system operations (mkdir, rm, rename, etc.)    | Yes                   | No

A rather striking feature of the FTP protocol is that it uses multiple (at least two) connections. One channel is the control channel, through which commands are sent to the server and its responses are returned (usually via TCP port 21); through the others the actual data transmission occurs, one channel for each transfer. Therefore, within one FTP session you can transfer several files simultaneously, in both directions. For each data channel its own TCP port is opened, whose number is chosen either by the server or by the client, depending on the transfer mode.
FTP has a binary transfer mode, which reduces traffic overhead and shortens exchange time when transferring large files. (Text-oriented protocols such as e-mail's SMTP traditionally required binary data to be encoded into text form, for example with the Base64 algorithm; HTTP, by contrast, can carry binary content directly.)
When working via the FTP protocol, the client enters a session, and all operations are carried out within that session (in other words, the server remembers the current state). The HTTP protocol does not "remember" anything: its task is to deliver the data and forget about it, so state over HTTP is maintained by mechanisms external to the protocol, such as cookies.
FTP operates at the application layer of the OSI model and is used to transfer files over TCP/IP. For this, an FTP server must be running and awaiting incoming requests. The client computer contacts the server on port 21. This connection (the control flow) remains open for the duration of the session. The second connection (the data stream) is opened either by the server, from its port 20 to the corresponding client port (active mode), or by the client, from any port to the corresponding server port (passive mode); it is needed to transfer the data file. The control flow is used to operate the session - for example, for the exchange of commands and passwords between client and server using a Telnet-like protocol. For example, "RETR filename" transfers the specified file from the server to the client. Because of this two-port structure, FTP is considered an out-of-band protocol, as opposed to in-band HTTP.

Connection and data transfer

Web browser support

Most regular web browsers can retrieve files located on FTP servers, although they may not support protocol extensions such as FTPS. When an FTP address is specified rather than an HTTP address, the content available on the remote server is presented similarly to other web content. A fully functional FTP client can be run in Firefox as the FireFTP extension.

Syntax

The FTP URL syntax is described in RFC 1738, in the form: ftp://[<user>[:<password>]@]<host>[:<port>]/<path> (the parameters in square brackets are optional). For example:
ftp://public.ftp-servers.example.com/mydirectory/myfile.txt

More details about specifying the username and password can be found in the browser documentation. By default, most web browsers use passive (PASV) mode, which passes through end-user firewalls more easily.
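A sketch of how such an FTP URL breaks down into its parts, using Python's urllib.parse; the host is the illustrative one from the example above, while the user name, password and port are invented for demonstration:

```python
from urllib.parse import urlsplit

# Hypothetical credentials and port, following the ftp://user:password@host:port/path form.
url = "ftp://user:secret@public.ftp-servers.example.com:2121/mydirectory/myfile.txt"
parts = urlsplit(url)

print(parts.username)    # user
print(parts.password)    # secret
print(parts.hostname)    # public.ftp-servers.example.com
print(parts.port or 21)  # explicit port, falling back to FTP's default 21 when omitted
print(parts.path)        # /mydirectory/myfile.txt
```

When the optional parts are omitted (ftp://host/path), username, password and port come back as None, and the client falls back to anonymous access on port 21.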

Safety

FTP was not designed to be a secure protocol (especially by today's standards) and has numerous security vulnerabilities. In May 1999, the authors of RFC 2577 summarized the vulnerabilities into the following list of issues:

  • Bounce attacks
  • Spoof attacks
  • Brute force attacks
  • Packet capture, sniffing
  • Username protection
  • Port stealing

FTP cannot encrypt its traffic; all transmissions are clear text, so usernames, passwords, commands and data can be read by anyone able to intercept packets on the network. This problem is typical of many Internet protocol specifications (including SMTP, Telnet, POP and IMAP) developed before the creation of encryption mechanisms such as TLS and SSL. The usual solution is to use either the "secure", TLS-protected versions of the vulnerable protocols (FTPS for FTP, TelnetS for Telnet, etc.) or a different, more secure protocol such as SFTP/SCP, provided with most implementations of the Secure Shell protocol.

Secure FTP

There are several methods for securely transferring files, at one time or another called "Secure FTP".

FTPS

Explicit FTPS is an extension to the FTP standard that allows clients to request that the FTP session be encrypted. This is implemented by sending the "AUTH TLS" command. The server may allow or reject connections that do not request TLS. This protocol extension is defined in a separate RFC. Implicit FTPS is a legacy variant of FTP that requires an SSL or TLS connection; it was supposed to use ports different from those of normal FTP.

SFTP

SFTP, or "SSH File Transfer Protocol", is not related to FTP, except that it also transfers files and has a similar set of commands for users. SFTP, or secure FTP, is a program that uses SSH (Secure Shell) to transfer files. Unlike standard FTP, it encrypts both commands and data, preventing passwords and sensitive information from being transmitted openly over the network. SFTP is similar in functionality to FTP, but because it uses a different protocol, standard FTP clients cannot communicate with an SFTP server and vice versa.

FTP over SSH (not SFTP)

FTP over SSH (not SFTP) refers to the practice of tunneling a regular FTP session over an SSH connection. Because FTP uses multiple TCP connections, tunneling it over SSH is especially difficult. When many SSH clients try to establish a tunnel for the control channel (the original client-server connection on port 21), only this channel will be protected; when transferring data, the FTP software at either end will establish new TCP connections (data channels) that bypass the SSH connection and thus lose its protection.

Otherwise, the SSH client software would need some knowledge of FTP in order to monitor and rewrite messages in the FTP control flow and autonomously open new forwardings for the FTP data flow.

FTP over SSH is sometimes referred to as secure FTP; but it should not be confused with other methods such as SSL/TLS (FTPS). Other file transfer methods using SSH and not related to FTP are SFTP and SCP; in each of them, both credentials and file data are always protected by the SSH protocol.

FTP. Basic Concepts

  • user - the user name.
  • the colon is the separator between the user name and the password.
  • password - the password.
  • @ - separates the user data from the address.

Next comes the address itself. This can be an IP address, or the address can be a literal name (ftp.ur.ru). After the address there is again a separating colon, which separates the address from the number of the port to which you should connect. By default this port is 21, but it can be any number designated by the server administrator.

For anonymous access, for example, the user name is anonymous, the password is an e-mail address, and the port is 21.

FTP modes

When working via the FTP protocol, two connections are established between the client and the server: a control connection (commands travel over it) and a data connection (files are transferred over it). The control connection is the same for active and passive modes. The client initiates a TCP connection from a dynamic port (1024-65535) to port 21 on the FTP server and says: "Hi! I want to connect to you. Here are my name and my password." Further actions depend on which FTP mode (active or passive) is selected.

  • In active mode, when the client says "Hello!", it also tells the server a port number (from the dynamic range 1024-65535) so that the server can connect back to the client to establish the data connection. The FTP server connects to the specified client port, using TCP port 20 on its side for data transfer. For the client such a connection is incoming, so working in active mode with clients behind a firewall or NAT is often difficult or requires additional configuration.
  • In passive mode, after the client says "Hello!", the server tells the client a TCP port number (from the dynamic range 1024-65535) to which it can connect to establish the data connection. In this case the ports on both the client and server sides turn out to be arbitrary. In passive mode the client can easily work with a server through its own firewall, but for the server to support passive mode, the firewall on the server side often has to be configured accordingly.

The main difference between active FTP mode and passive FTP mode is the side that opens the connection for data transfer. In active mode, the client must be able to accept this connection from the FTP server. In passive mode, the client always initiates this connection itself, and the server must accept it.

FTP is a service based solely on TCP (Transmission Control Protocol). FTP is unusual in that it uses two ports, a "data" port and a "command" port (also known as a control port). Traditionally this is port 21 for commands and port 20 for data. However, depending on the mode, the data port will not always be 20.

In active FTP mode, the client connects from an arbitrary unprivileged port (N > 1024) to the FTP server's command port 21. The client then starts listening on port N+1 and sends the FTP command PORT N+1 to the server. In response, the server connects to the specified client data port from its local data port 20.

In passive FTP mode, the client initiates both connections to the server, solving the problem of firewalls filtering the client's incoming data port. When opening an FTP connection, the client locally opens two unprivileged ports (N > 1024 and N+1). The first port contacts the server on port 21, but instead of then issuing a PORT command and letting the server connect back to its data port, the client issues the PASV command. As a result, the server opens an arbitrary unprivileged port (P > 1024) and tells the client its number in the reply. The client then initiates the data connection from port N+1 to port P on the server.
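The port arithmetic behind the PORT command and the server's passive-mode reply follows the same pattern, h1,h2,h3,h4,p1,p2, where the first four numbers are the IP address and the port number is p1*256 + p2. A small sketch (the addresses and ports are invented for illustration):

```python
import re

def port_argument(host, port):
    """Format the argument of the FTP PORT command: h1,h2,h3,h4,p1,p2."""
    octets = host.split(".")
    # The 16-bit port number is split into a high and a low byte.
    return ",".join(octets + [str(port // 256), str(port % 256)])

def parse_pasv_reply(reply):
    """Extract (host, port) from a '227 Entering Passive Mode (...)' reply."""
    m = re.search(r"(\d+),(\d+),(\d+),(\d+),(\d+),(\d+)", reply)
    h1, h2, h3, h4, p1, p2 = map(int, m.groups())
    return f"{h1}.{h2}.{h3}.{h4}", p1 * 256 + p2

print(port_argument("192.168.0.10", 1025))
print(parse_pasv_reply("227 Entering Passive Mode (10,0,0,5,19,137)"))
```

So a client listening on 192.168.0.10 port 1025 announces "PORT 192,168,0,10,4,1", and a server answering "(10,0,0,5,19,137)" is inviting a data connection to 10.0.0.5 port 5001.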

FTP server

FTP server - a computer that holds publicly accessible files and is configured to support the FTP protocol (the FTP server must run software that supports FTP).

There are currently three types of FTP servers on the Internet:

  1. Internet-style (access to all server files)
  2. Listserver (limited access)
  3. FTPmail (access via email).

FTPmail servers are most interesting for users whose Internet access is very limited, that is, who can only use e-mail. You enter several special commands in your letter, which the chosen FTPmail server must execute. If everything is entered correctly and your letter arrives as intended, the FTPmail server will start looking for the required file in almost all corners of the Internet. If the file is found, it will be sent to you; otherwise you will receive a letter saying that the file does not exist. This is all well and good, but if you have full access to Internet resources, it is of no use to you.

FTP servers are one of the ways of storing large amounts of data on the Internet. An FTP server is a kind of file library. To transfer files between FTP servers and the user's computer, the FTP (File Transfer Protocol) is used.

What is an FTP server for? You can download files posted on numerous FTP servers to your computer. There are thousands of FTP servers on the Internet that provide free anonymous access to gigabytes of a wide variety of information: text documents, software distributions, photographs and music files. You can upload your home pages to free servers that provide space for them. This is much more convenient than using HTTP, when on a special server page you indicate the files that need to be downloaded.

There is also FileZilla Server, a project related to FileZilla Client. It is an FTP server developed by the same organization, supporting FTP, SFTP and FTPS (FTP over SSL/TLS).

Creating and setting up an FTP server using the example of FileZilla Server

Creating your own home FTP server lets you organize a convenient way of transferring data over a local or global network. To run one at home, you can use free software such as FileZilla Server. This program has all the necessary functionality and is easy to configure.

FileZilla Server is distributed under a free license, so the distribution can be freely downloaded from the developer's website. Before installation, you must specify the port on which the administrator interface listens and choose how the FTP service is started. If you leave the default settings, the installer will select a random port and add the FTP service to Windows startup.

Also, before installing FileZilla Server, you must select a method for starting the server when the system boots. By default, automatic start of the FTP service is activated for all users when they log in to the OS.

Once the installation is complete, the program places its icon in the tray, and clicking on it opens the server administration panel. In it you should first of all confirm the server address 127.0.0.1 and the specified port, and also, if necessary, create and enter an administrator password.

You should start setting up FileZilla Server by creating one or more users and allowing them access to certain directories on the computer. To do this, select the “Users” item in the “Edit” menu and click the “Add” button. In the window that appears, enter an arbitrary user name, if desired placing it in a specific group (groups can be created in the “Edit - Groups” menu). After clicking “Ok”, an account with the given name will be created, and you can start configuring it.

By default, a new FileZilla Server user is created without a password. To set it, you should check the “Password” box in “General” and enter it. In the same window, you can set restrictions on the number of connections for the selected user (0 - no restrictions).

In the “Shared folders” tab, you need to add the user’s root directory and select the directories to which he will have access. You can set the selected directory as the root directory by clicking the “Set as home dir” button. Also in this window you can specify the rights for the selected user to the files and directories available to him. For example, checking the “Write” and “Delete” checkboxes in the “Files” category will give the anonymous account rights to write and delete files in the “C:\FTP” directory.

The “Speed Limit” tab is responsible for limiting upload and download speeds for a specific account. These settings can be left unchanged.

In the “IP filter” window, the administrator can deny access to the FTP server from certain IPs or subnets. This may be useful in the future in detecting careless users uploading illegal content to the server or causing inconvenience in other ways.

You can access the general server settings, which apply to all accounts, from the “Edit - Settings” menu. Most parameters, in particular, speed limits, IP blacklist, SSL and Autoban, can initially be left as they are. It is worth paying attention to the “Passive mode settings” item, which allows you to enter the server’s domain name instead of IP. This will be useful if the address is dynamic and changes every time you connect to the network.

A free domain name can be registered, for example, using the DynDNS service.

To let users work with the FTP server, give them its address and the account login details. All their actions will be displayed in the main FileZilla window.
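
From the user’s side, those credentials are all an FTP client needs. A minimal sketch using Python’s standard `ftplib` module; the host name and account details are placeholders for whatever the administrator hands out:

```python
from ftplib import FTP

# Hypothetical server address and account; replace with the real host
# name and the credentials created on the server.
HOST, USER, PASSWORD = "ftp.example.com", "student", "secret"

def list_root(host, user, password):
    """Log in and return the names in the user's root ("home") directory."""
    with FTP(host) as ftp:          # connects on the default port 21
        ftp.login(user, password)   # the USER/PASS exchange
        return ftp.nlst()           # NLST: a plain list of file names

if __name__ == "__main__":
    # Requires a reachable server, so only run when executed directly.
    print(list_root(HOST, USER, PASSWORD))
```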

FTP client

An FTP client (FTP - File Transfer Protocol, literally “file transfer protocol”) is a program that simplifies access to an FTP server. Depending on its purpose, it may give the user simple access to a remote FTP server in text console mode, taking on only the forwarding of user commands and files; or it may display the files on the remote server as if they were part of the file system of the user’s computer; or both. In the last two cases, the FTP client takes on the task of translating user actions into protocol commands, making it possible to use the file transfer protocol without learning all its intricacies.

Typical uses of an FTP client include:

  • Publishing website pages on an Internet server (by a web developer)
  • Downloading music, programs and other data files by an ordinary Internet user. Many users do not even recognize this as use of an FTP client and protocol, since many public servers do not request authentication data and Internet browsers (which are also FTP clients) download files without additional questions.
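
The second case relies on the anonymous-access convention: the user name “anonymous” with an e-mail address as the password. A sketch with `ftplib`; the server and path in the usage comment are only examples:

```python
from ftplib import FTP

def fetch_anonymous(host, remote_path, local_path):
    """Download a file from a public server using the anonymous
    convention: user name "anonymous", an e-mail address as password."""
    with FTP(host) as ftp:
        ftp.login("anonymous", "guest@example.com")
        with open(local_path, "wb") as out:
            # RETR streams the file; ftplib calls out.write per data block
            ftp.retrbinary("RETR " + remote_path, out.write)

# e.g. fetch_anonymous("ftp.gnu.org", "/gnu/GNUinfo/README", "README")
```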

Implementation

In the case that is simplest for the user (though the most complex to implement), the FTP client presents the remote server as an emulated file system that simply happens to be located on another computer. With this file system you can perform all the usual actions: copy files from the server and to the server, delete files, and create new ones. In some cases files can also be opened for viewing, running programs or editing. Keep in mind that opening a file implies first downloading it to the user’s computer. Examples of such programs include:

  • Internet browsers (these often run in read-only mode, i.e. they do not allow files to be added to the server)
  • Many file managers. Windows ships with the console client ftp.exe, and many Linux distributions also include an ftp utility.

Access rights and authorization

The file system on a remote server usually has access-right settings for different users. For example, anonymous users may have access to only some files and not even know that others exist. Another group of users may have access to different files, or may be granted, in addition to the right to read files, the right to write new files or update existing ones. The range of access-right options depends on the operating system and software of each specific FTP server. As a rule, servers distinguish the right to view the contents of a folder (that is, to get a list of the files it contains), the right to read files, and the right to write (create, delete, update) files.
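
From a client’s point of view, a missing right shows up as a refused command. FTP replies are three-digit codes, and 5xx means the request was rejected (550 is the typical “no access” answer); `ftplib` raises `error_perm` for these. A sketch, with the helper function names my own:

```python
from ftplib import error_perm
import io

def reply_means_denied(reply_line):
    """FTP replies begin with a three-digit code; 5xx means refused."""
    return reply_line[:1] == "5"

def try_upload(ftp, data, remote_name):
    """Attempt a STOR; an account without write rights gets error_perm
    instead of a 226 transfer-complete reply."""
    try:
        ftp.storbinary("STOR " + remote_name, io.BytesIO(data))
        return "stored"
    except error_perm as exc:        # e.g. "550 Permission denied."
        return "denied: " + str(exc)

print(reply_means_denied("550 Permission denied."))   # -> True
print(reply_means_denied("226 Transfer complete."))   # -> False
```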

SmartFTP

This client can be found on the Internet at www.smartftp.com. Its distribution package is roughly three to six megabytes. Among the program’s features (its “regalia”) the authors list: TLS/SSL support, IPv6 support, on-the-fly data compression, UTF-8 support, the ability to transfer files directly between two servers, remote file editing, a built-in download scheduler, a backup creation tool, command-line support and other functions more or less standard for FTP clients. The program’s interface is convenient, attractive and quite conventional.

Let's consider the main Internet resources (services). The most popular resource is the World Wide Web (WWW): a huge number (over a billion) of multimedia documents whose distinctive feature is the ability to link to each other. This means that the current document can contain a link leading to any other WWW document, which may physically be located on a different computer on the Internet. Using special programs for viewing WWW documents, an Internet user can quickly follow links from one document to another, travelling across the World Wide Web.

WWW (World Wide Web) - a set of interconnected hypermedia documents.
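
The “ability to link to each other” is carried by hyperlinks in the documents themselves. A small sketch with Python’s standard `html.parser` showing how the link targets can be pulled out of a page (the sample page is made up for illustration):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect the href targets of <a> tags - the links that tie
    separate pages together into a "web"."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

page = '<p>See the <a href="http://example.com/next.html">next page</a></p>'
collector = LinkCollector()
collector.feed(page)
print(collector.links)   # -> ['http://example.com/next.html']
```

Following such extracted links from page to page is exactly what a browser (or a web crawler) automates.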

Entire libraries of files are located on the Internet, access to which is provided by the FTP service.

FTP (File Transfer Protocol) - a system for storing and transferring files of all kinds.

As mentioned above, in its early days the computer network was used mainly for quickly sending text messages. The oldest resource on the Internet is therefore e-mail (electronic mail).

E-mail (electronic mail) - a system for forwarding electronic messages.
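
An e-mail message is a structured object with headers (sender, recipient, subject) and a body, handed to a mail server over SMTP. A sketch with Python’s standard `email` and `smtplib` modules; all addresses and the server name are placeholders:

```python
from email.message import EmailMessage
import smtplib

def build_message(sender, recipient, subject, body):
    """Assemble a standards-compliant e-mail message."""
    msg = EmailMessage()
    msg["From"], msg["To"], msg["Subject"] = sender, recipient, subject
    msg.set_content(body)
    return msg

msg = build_message("alice@example.com", "bob@example.com",
                    "Hello", "Sent over SMTP, the protocol behind e-mail.")
print(msg["Subject"])   # -> Hello

if __name__ == "__main__":
    # Actually sending needs a reachable SMTP server (placeholder host):
    # with smtplib.SMTP("mail.example.com") as smtp:
    #     smtp.send_message(msg)
    pass
```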

There is a special service on the Internet that allows messages to be posted on interconnected computers for the exchange of opinions. By connecting to one of these computers and selecting a discussion group (teleconference) that matches your interests, you can read posted messages, ask the group a question, or answer someone else's. Messages are usually replicated quickly to the other computers and stored for a limited time, which is why this resource is called newsgroups.

Newsgroups - a global distributed system for exchanging messages and conducting discussions.

One of the most popular systems of this kind is Usenet newsgroups.

The Telnet service allows you to connect to a remote computer and work with its resources.

Telnet - a service for remotely controlling computers.

However, such computers most often run some variant of the Unix operating system, so this service is currently used primarily by network administrators.

Finally, the Internet has the IRC (chat) system, which allows users to communicate in real time by typing text at the keyboard.

IRC (Internet Relay Chat, “conversation over the Internet”) - a service for communication between Internet users in real time by typing text at the keyboard.
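
Under the hood, an IRC client exchanges short text commands with the server (RFC 1459). A sketch of the opening exchange of a session, with the nickname and channel invented for illustration; real clients terminate each line with CRLF on the wire:

```python
def irc_lines(nick, channel, text):
    """The opening exchange of an IRC session as raw protocol lines
    (per RFC 1459): pick a nickname, register, join a channel, speak."""
    return [
        f"NICK {nick}",               # choose a nickname
        f"USER {nick} 0 * :{nick}",   # register the connection
        f"JOIN {channel}",            # enter a discussion channel
        f"PRIVMSG {channel} :{text}", # send a message to the channel
    ]

for line in irc_lines("student", "#school", "Hello, everyone!"):
    print(line)
```

The text a user types is thus wrapped into `PRIVMSG` commands, which the server relays to everyone in the channel in real time.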

The Internet can be used in various areas:

- professional activity;

- commercial activity;

- receiving educational services;

- recreation and entertainment.

In professional activities, the Internet can be used to search for information on topics of interest and to organize joint projects with specialized companies.



