Key performance indicators (KPIs) measure the success of a migration. They are suitable for reporting because they are quantifiable and less prone to bias. KPIs are captured before, during, and after the migration to determine the outcome. These metrics are specific to technology migrations and help preserve domain authority.
The click-through rate (CTR) is the number of clicks divided by the number of impressions; for example, 50 clicks on 2,000 impressions is a CTR of 2.5%. The data comes from the search consoles for Google and Bing. Social platforms have a similar metric called engagement. The relevant engagements are those for links back to pages and images on the website. Tracking CTR detects breaks in how the website integrates with the rest of the internet.
The referrer site identifies the source of user sessions. It counts the number of clicks from search engines, social platforms, backlinks, and direct connections. Session records link each referrer to a specific page. For referrers generating significant website traffic, get an account on the source platform to track its click-through rates.
Many types of demographics apply to website analysis. Geolocation of the IP address guides the selection of hosting sites. Screen size and device type help when testing and optimizing the user interface. Language settings, user location, and time of day provide content development hints. Social platforms have richer user data, including age, gender, and interests. All these sources feed into website planning.
A session begins when a user loads one or more pages. Getting quality session data requires filtering out bot traffic. Conventional filters flag clients that access pages too quickly, do not provide correct browser headers, or do not load the other URLs within a page.
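The first of those filters can be sketched as a shell helper; the function name, the log format (IP address, epoch seconds, and URL per line, sorted by time), and the one-second threshold are all illustrative assumptions:

```shell
# Flag clients that request pages too quickly to be human.
# Assumed input format: "IP epoch-seconds URL", one request per line,
# sorted by time.
flag_fast_clients() {
  awk '{
    # A repeat request from the same IP less than one second after the
    # previous one marks that IP as likely bot traffic.
    if ($1 in last && $2 - last[$1] < 1) bots[$1] = 1
    last[$1] = $2
  } END { for (ip in bots) print ip }'
}

# Usage: flag_fast_clients < access.log
```

Real log formats differ, so the field positions and the threshold would need adjusting to match the server's configuration.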
A user session ends after a period of inactivity. If a new session starts with an existing cookie, it is a return session. The count is approximate because a small percentage of browsers do not accept cookies, and users can purge them. Even so, the change in the percentage of return traffic is a critical SEO metric.
Pages per session tracks how the user navigates the site. A similar concept is the bounce rate, which is the percentage of users who enter and then quickly exit. Both are critical when measuring engagement, and both impact the cost per click with advertisers.
A conversion is when a user progresses through the website sales funnel. Examples include warm sales leads and purchasing a product. Conversions are backtracked through the pages per session to reconstruct what works. Similarly, reconstructing non-conversions clarifies what does not work.
The response time is the time taken to load a web page, including all the synchronously loaded URLs on the page. Measuring it as experienced by the user is possible but complex and rarely done. Instead, testing ensures the site meets its response-time requirements while under full load and from any client location.
The recovery time objective is how long it takes to resume operations after a failure. Current development and operations (DevOps) practices automate the detection of failures and the launching of recovery programs. Cloud services extend that capability to cover server hardware and data centers. For most websites, restorations from planned scenarios take minutes. Those scenarios include ransomware, human error, software faults, hardware failures, and data center outages.
The error rate is the number of issues logged by the system, web server, database, router, and other subsystems. Weblog entries for URL redirects, page-not-found responses, and other failures require attention before they impact additional users or are detected by search bots. Fixing issues before migration simplifies the analysis of events happening afterward.
Code efficiency is a measure of the hardware resources the website requires. Practical solutions use fewer resources, which reduces operational costs; the resource overhead between sites can vary by a factor of more than 50. Efficiency has other benefits, such as reduced complexity, which makes the solution easier to maintain.
The search index is the list of the website's URLs known to the search engines, and it is critical to preserve their ranking. Extract the URLs from the Google and Bing consoles along with their impressions and click-through rates. Check that each one exists on the website before migration. Develop a redirection plan for any URL changes. Track the impressions and click-through rates to ensure the search engines accept the updates.
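The pre-migration existence check can be scripted with curl; the helper name check_urls and the one-URL-per-line input file are illustrative assumptions:

```shell
# Read one URL per line (exported from the search consoles) and print
# any that do not answer with HTTP 200, so they can be fixed or added
# to the redirection plan before migration.
check_urls() {
  while read -r url; do
    status=$(curl -o /dev/null -s -w '%{http_code}' "$url")
    [ "$status" = "200" ] || echo "MISSING $status $url"
  done
}

# Usage: check_urls < indexed-urls.txt
```

URLs that answer with a redirect also appear in the output, which is useful because each one needs an entry in the redirection plan.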
A sitemap is a list of URLs for search bots. It is a promise to the search engine that a URL will be there when a user clicks on the search result. Reneging on the promise degrades domain authority. A sitemap speeds up indexing for new sites, when there are a large number of URLs, and when a URL is not reachable through HTML links. The search engine still works without a sitemap, but it is less confident of the results, and that degrades page ranking.
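A minimal sitemap, using the standard sitemaps.org schema with a placeholder URL and date, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```

One url element is listed per page; the lastmod date tells the bot whether a page needs re-crawling.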
A redirection funnel is a collection of URLs that point to the canonical URL. It supports usability, name standardization, and domain transfers. Search indices update automatically, provided the redirection returns a permanent HTTP 301 code and not a temporary 302. Backlinks typically require manual communication with the linking sites so they can update their links.
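As a hedged sketch, assuming an nginx server and hypothetical /old-page and /new-page paths, a permanent redirect looks like this:

```nginx
# Permanent (301) redirect from a legacy path to the canonical URL.
# Returning 302 here would stop the search index from transferring.
location /old-page {
    return 301 https://www.example.com/new-page;
}
```

Other servers express the same rule differently, but the essential choice is the 301 status code.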
The robots file contains a list of URLs the search engine should not browse. Blocking a URL prevents its search index entry from being added, deleted, or moved. Check the file before the migration to ensure it does not block updates and permits page rank transfers. The following code in /robots.txt allows search bots to access all URLs.
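```
User-agent: *
Disallow:
```

The empty Disallow value matches no URLs, so the entire site stays crawlable.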
The page hit rate is the maximal number of web page requests per minute. It is crucial to server sizing. Server sizing becomes more of an issue with dynamically generated web pages, larger payloads per page, and greater network distances.
Increasing the number of pages extends the time taken to create and update content. The number of pages can also affect server performance. The following chart shows the distribution of web pages per site.
The integration of technologies increases complexity. The most straightforward approach is a standalone installation, although that has become less common. Modern websites should track a range of business and technical metrics, such as the KPIs mentioned at the top of the page. More sophisticated solutions integrate with applications for analytics, order management, customer relationship management, and other functions. Designs should avoid fetching data synchronously from a second application because that complicates operations management. Instead, the feeds should be asynchronous or batch-oriented.
Downloading content through the website interface is the easiest approach. The following command uses the open-source utility wget, which has releases for Windows, macOS, and Linux. It places the content in the working directory.
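Combining the options explained below, the full command is as follows; www.example.com is a placeholder for the site to download:

```shell
# Mirror the site into the working directory; replace the placeholder
# domain with the site to download.
wget --mirror --convert-links --adjust-extension --page-requisites https://www.example.com/
```

wget creates a subdirectory named after the host and writes the pages and their ancillary files under it.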
--mirror recurses the entire site.
--convert-links updates the hyperlinks within each page for local viewing.
--adjust-extension adds the .html file extension when none is present, which helps with local file viewing of the website.
--page-requisites downloads the ancillary files.
Downloading this site takes a second; however, allow fifteen seconds per page, which covers 95% of websites. If there are issues, first check whether the download is still in progress. Adding the following options may speed it up.
Add --no-parent to prevent traversing up the directory structure.
Use --reject "php,sh,avi,xml" to remove those files from the download.
Viewing the website from a local file system simplifies development. Point the web browser to the index.html file in the directory created by the above command. See the URL file://<absolute path to directory>index.html in the following image as an example. Our tests found that one in 20 sites does not display correctly. In those cases, the issue was a plugin for image sliders. The recommended approach is to remove that code, as it also prevents search bots from detecting the content.
The upload technology varies between hosting platforms. The Strategic Mind Solution provides secure transfer with multi-factor authentication and replication across geographically dispersed data centers.