How to Promote Your Blog for Free: 101 Ways to Increase Traffic

Efficiently and effectively scaling up language model pretraining for best language representation model on GLUE and SuperGLUE

As part of Microsoft AI at Scale, the Turing family of NLP models is being used at scale across Microsoft to enable the next generation of AI experiences. Today, we are happy to announce that the latest Microsoft Turing model, T-NLRv5, is the state of the art at the top of the SuperGLUE and GLUE leaderboards, further surpassing human performance and other models. Notably, T-NLRv5 is the first model to reach human parity on MNLI and RTE, the last two GLUE tasks on which human parity had not yet been achieved. In addition, T-NLRv5 is more efficient than recent pretraining models, achieving comparable effectiveness with 50 percent fewer parameters and lower pretraining compute cost.

Turing Natural Language Representation v5 (T-NLRv5) integrates some of the best modeling techniques developed by Microsoft Research, Azure AI, and Microsoft Turing. The models are pretrained at large scale using an efficient training framework based on FastPT and DeepSpeed. We're excited to bring new AI improvements to Microsoft products using these state-of-the-art techniques.

Model architecture and pretraining task

T-NLRv5 is largely based on our recent work COCO-LM, a natural evolution of the pretraining paradigm that combines the benefits of ELECTRA-style models with corrective language model pretraining. As illustrated in Figure 2, T-NLRv5 employs an auxiliary transformer language model to corrupt an input text sequence, and the main transformer model is pretrained with the corrective language model task, which is to detect the tokens replaced by the auxiliary model and correct them back to the original tokens. This augments the ELECTRA model family with language modeling capacity, bringing together the benefits of pretraining on adversarial signals generated by the auxiliary model and of language modeling itself, which is handy for prompt-based learning.
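
To make the corrective language model task concrete, here is a minimal PyTorch-style sketch of one pretraining step. It is a simplified illustration, not the T-NLRv5 implementation: aux_model and main_model are hypothetical stand-ins for the auxiliary and main transformers, both assumed to map token IDs to per-position vocabulary logits, and the real system adds details such as loss weighting that are omitted here.

```python
import torch
import torch.nn.functional as F

def corrective_lm_step(aux_model, main_model, tokens, mask_prob=0.15, mask_id=103):
    """One simplified step of corrective language model pretraining.

    tokens: LongTensor [batch, seq] of original token IDs.
    aux_model / main_model: callables returning [batch, seq, vocab] logits.
    """
    # 1) Mask a random subset of positions; the auxiliary model is trained
    #    as a masked language model to recover the originals.
    masked = torch.rand(tokens.shape) < mask_prob
    aux_logits = aux_model(tokens.masked_fill(masked, mask_id))
    aux_loss = F.cross_entropy(aux_logits[masked], tokens[masked])

    # 2) Sample replacements from the auxiliary model's predictions to build
    #    the corrupted sequence (plausible but often wrong tokens). Sampling
    #    happens under no_grad: the samples are training data for the main
    #    model, so no gradient flows back into the auxiliary model from here.
    with torch.no_grad():
        samples = torch.distributions.Categorical(logits=aux_logits[masked]).sample()
    corrupted = tokens.clone()
    corrupted[masked] = samples

    # 3) The main model must detect and correct the replacements by predicting
    #    the original token at every position (copying tokens left unchanged).
    main_logits = main_model(corrupted)
    main_loss = F.cross_entropy(
        main_logits.reshape(-1, main_logits.size(-1)), tokens.reshape(-1)
    )
    return aux_loss + main_loss
```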

We also leverage the training dataset and the data processing pipeline optimized for developing previous T-NLR releases, including DeBERTa and UniLM, as well as the implementation optimizations from other Microsoft pretraining research efforts, such as TUPE.

Another key property of T-NLRv5 is that it maintains its effectiveness from smaller sizes, such as base and large configurations with a few hundred million parameters, up to bigger sizes with billions of parameters. This is achieved by carefully selecting techniques that maintain model simplicity and optimization stability. We disabled dropout in the auxiliary model, so that pretraining the auxiliary model and generating the main model's training data are done in one pass. We also disabled the sequence contrastive learning task in COCO-LM to reduce computing cost. These simplifications enable us to stick with the post-layer-norm transformer architecture, which allows us to train deeper transformer networks more effectively.
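
For readers unfamiliar with the distinction, the sketch below shows what post-layer-norm means in practice: LayerNorm is applied after the residual addition rather than before the sublayer. This is a generic illustration with assumed dimensions, not the T-NLRv5 architecture itself.

```python
import torch.nn as nn

class PostLNBlock(nn.Module):
    """A generic post-layer-norm transformer block: each sublayer's output is
    added to the residual first, then LayerNorm is applied to the sum.
    (A pre-layer-norm block would instead compute x + attn(ln(x)).)"""

    def __init__(self, d_model=768, n_heads=12, d_ff=3072):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ffn = nn.Sequential(
            nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model)
        )
        self.ln1 = nn.LayerNorm(d_model)
        self.ln2 = nn.LayerNorm(d_model)

    def forward(self, x):
        x = self.ln1(x + self.attn(x, x, x, need_weights=False)[0])  # add, then norm
        x = self.ln2(x + self.ffn(x))                                # add, then norm
        return x
```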

We will make T-NLRv5 and its capabilities available in the same way as other Microsoft Turing models. We will leverage its increased capabilities to further improve the execution of popular language tasks in Azure Cognitive Services, and customers will automatically benefit from these improvements.

How to promote your blog on other sites

1. Comment on other blogs

The right comment on a highly authoritative blog can send you lots of traffic. Identify the top blogs in your niche and start interacting and commenting on them regularly; this will get the blogger and the members of their community to notice you and visit your blog, leading to more traffic for you. You might even get a link, or an invitation to guest blog, out of it.

2. Develop a blogger outreach plan

For every article you publish, make sure you have a solid blogger outreach plan: compile a list of dozens of relevant bloggers you can email your article to, and ask them to share or link to it. This article is a good place to learn how to do blogger outreach.

3. Look for opportunities to be included in link roundups

Every niche has at least a dozen blogs that do weekly or monthly roundups: at the end of the week or month, these blogs publish an article summarizing the most interesting content they read during that period. You can get some nice exposure by emailing these bloggers and asking them to include your article in their next roundup.

4. Enable trackbacks on your blog

Anytime you publish a new article on a blog hosted with WordPress, you get the option to enable comments and trackbacks; a trackback is an automatic notification sent to any website you link to, and if approved, it gets a link to your post featured in that site's comments section as well.

If you link to top blogs in your article, and both you and the blogs you linked to have trackbacks enabled, a link to your article will appear in their comments sections; people reading the comments on those blogs will see a link to your article, which will eventually lead to more traffic for you.
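
WordPress sends trackbacks for you automatically, but it helps to see how simple the mechanism is. The sketch below sends a single trackback ping by hand using Python's requests library; the URLs and field values are placeholders, and the endpoint shown is only the common WordPress pattern, so treat this as an illustration of the protocol rather than a tool.

```python
import requests

# Placeholder addresses; on WordPress sites the trackback endpoint is
# commonly the post permalink with /trackback/ appended.
TRACKBACK_URL = "https://example.com/2024/01/some-post/trackback/"

# A trackback ping is just an HTTP POST of form-encoded metadata
# describing your post.
response = requests.post(
    TRACKBACK_URL,
    data={
        "title": "My article title",
        "url": "https://yourblog.example/my-article/",  # the link back to you
        "excerpt": "A short summary the other blog can show in its comments.",
        "blog_name": "Your Blog",
    },
    timeout=10,
)

# The receiving blog answers with a small XML document; an <error> value
# of 0 means the ping was accepted (it may still be held for moderation).
print(response.status_code, response.text)
```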

5. Submit your content to blogging communities

Blogging communities have existed for a long time and are a good place to get traffic; some notable examples are Bizsugar in the business niche, Blokube in the blogging and social media niche, and Dribbble in the design niche. Look for a blogging community in your niche where you can share interesting articles, and occasionally submit your own.

Sources:

https://www.microsoft.com/en-us/research/blog/efficiently-and-effectively-scaling-up-language-model-pretraining-for-best-language-representation-model-on-glue-and-superglue/
https://startbloggingonline.com/how-to-promote-your-blog-and-get-visitors/