
AI Pseudo-Experiment: Blocking All the AI Tools I Use for One Day

https://codedoods.substack.com/p/do-not-pen

Hello beautiful people!

Welcome to the Programming and Doodles blog! In this article, I'll share the results of a pseudo-experiment: blocking all the AI tools I use for one otherwise normal day (as a developer and technical writer).

Subscribe to receive it directly in your inbox: https://codedoods.substack.com/

Ever since ChatGPT and the other chatbots and AI tools were released, we have gradually been loosening our grip on our own work. Using them to write and correct emails and even text messages may have led us to this scenario. Worse, I've seen posts stating that “research shows the use of AI reduces our ability to think critically.” That's what concerns me: do I still have the skills I had in 2020-22, before ChatGPT's release?

I can't disagree: AI tools save a lot of time and boost productivity. But at what cost? If we are slowly losing the ability to think and solve problems ourselves, we are destroying one of humanity's greatest gifts.

But don't get me wrong while reading this: I don't hate AI. I think it's an amazing thing programming has given us, and we all use it regularly. Even now, the Grammarly extension is showing me AI-powered ways to improve these very sentences. The only thing I don't like is the way we're handling it. Shoot, I need to write another article about that!

About the experiment

Back to the context: feeling a bit disappointed at how much AI I was using to get my work done, I decided to block every AI tool for one day. Fortunately, I haven't installed any of these apps on my MacBook, so all I needed was a site-blocker extension to block ChatGPT, Claude, Gemini, and DeepSeek.

This article was written alongside the work itself, and at the editing stage I realized it reads a little… strange. It's the “thinking” process of my silly human brain, like ChatGPT's reasoning trace. Please don't judge me by this article.

Date: Feb 17th, 2025

The to-do list:

- Write a script to scrape Bing Search results (for a tutorial on building a scraper).

- Review PR #6 on GitHub.

- Improve the SEO of my personal site, chenuli.codedoodles.xyz (current SEO score: 85).

- If more time’s left, design a merch product.

(Such a small list of tasks; yes, I'm my own manager.)

Writing the script

A scraper is one of the easiest programs you can write, and it's often overlooked. I've written about 3-4 articles on scraping different platforms, like TikTok with Apify, but all it took was a prompt or two to ChatGPT to write the script:

Write a python script using selenium and chromedriver to scrape "Trending Videos" page on TikTok. 

Structure: Page consists of a card-like structure which consists of trending videos on TikTok. These are inside a <div> container with the class of tiktok-559e6k-DivItemContainer e1aajktk28 blah blah and blah

But this time is different. I'll write the base of the script first, test it, and improve it as needed.

I spent a few minutes picking the library to use: Playwright, Selenium, or BeautifulSoup. BeautifulSoup looked like the easiest and simplest option, so I went with it.

import requests
from bs4 import BeautifulSoup

Next, I had to write headers that mimic a real browser request, so the script doesn't get blocked by bot protection (or a CAPTCHA). There's no way I could write them accurately from memory, so I reflexively opened ChatGPT. Busted, yes, but it was blocked, for the better.

After a good while, I used Google to find a sample. What a savior.

headers = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36"}

Then I had to build a search URL like the ones Bing uses. It should be similar to Google's, I think.

Hmm, not bad.
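In code, that's just string formatting (a quick sketch, assuming Bing's standard q query parameter; the exact line isn't visible in the screenshot above):

    from urllib.parse import quote_plus

    query = "Programming and Doodles"
    # URL-encode the query so spaces and symbols are safe
    url = f"https://www.bing.com/search?q={quote_plus(query)}"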

Next, I have to identify the class names (is that what they're called?) so I can scrape the elements individually.

Bing does an elegant job here. The results come in a list structure: an <ol>, with each list item being one search result. This makes things much easier. I can scrape all these elements from the page and assign them to a single variable, then use a loop to format the titles and links cleanly. Score! Not a genius method, but I haven't completely lost my ability to think, at least.

    completeData = soup.find_all("li",{"class":"b_algo"})
    

Oh shoot, this is stupid! It just adds an unnecessary variable and can't be reused.

We can just loop over the list of elements directly. Let's add some error handling as well.

    # bail out early if the request failed
    if response.status_code != 200:
        print("Failed to retrieve search results")
        return []

    for item in soup.find_all("li", class_="b_algo"):
        title = item.find("h2")
        link = title.find("a")["href"] if title else ""
        results.append((title.text if title else "No title", link))
    

This is much better. Finally, we can add another loop to format and print the results accordingly.

    search_results = scrape_bing("Programming and Doodles")
    for title, link in search_results:
        print(f"{title}: {link}")
    

A couple of tweaks later, it works!
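Put together, the whole thing looks roughly like this (a sketch: the post never shows the full file, so the function wrapper and imports are my reconstruction from the fragments above):

    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import quote_plus

    headers = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/111.0.0.0 Safari/537.36"}

    def scrape_bing(query):
        # fetch the Bing results page, pretending to be a normal browser
        response = requests.get(f"https://www.bing.com/search?q={quote_plus(query)}", headers=headers)
        if response.status_code != 200:
            print("Failed to retrieve search results")
            return []

        soup = BeautifulSoup(response.text, "html.parser")
        results = []
        # each <li class="b_algo"> is one organic search result
        for item in soup.find_all("li", class_="b_algo"):
            title = item.find("h2")
            link = title.find("a")["href"] if title else ""
            results.append((title.text if title else "No title", link))
        return results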

That felt good. Just like the old days.

Reviewing the PR on GitHub

This shouldn't take long. It's just one long Python script.

Nice work, buddy!

Looking at the PR, the code didn't break anything. However, the elements aren't centered like the original ones I created. This is the second time I've reviewed this PR, though, and the contributor seems new. Asking for changes again would feel bad, so I'll fix it myself and leave a good review.

Code-wise, he did a good job. But for some reason, he put a sticky option on the GUI elements. Let me remove that.

Hmm, still doesn't work. There must be something in Tkinter that handles centering elements.

I remember there's a keyword for it, but adding it to every single item is tiring. Most of the results on Google show the same approach.

Oh, I found one! We can just use grid_columnconfigure for that. Thank you, BitRake from StackOverflow.
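For illustration, the idea looks like this (my own minimal example, not the actual PR code): giving a column a weight lets it expand, and a widget gridded without the sticky option stays centered in its cell.

    import tkinter as tk

    root = tk.Tk()
    # let column 0 absorb all the extra horizontal space
    root.grid_columnconfigure(0, weight=1)

    # no sticky option, so the label centers itself in the (now full-width) cell
    label = tk.Label(root, text="I am centered")
    label.grid(row=0, column=0)

    root.mainloop()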

Phew, I can still survive as a developer in 2025 without AI.

Improving SEO

I recently built a personal website (not a portfolio), but its SEO score is a little terrible.

My go-to way of improving SEO is always using structured data (schema markup) to get rich results. If you're not aware of it, it's a form of metadata that helps search engines understand your content better, powering search result features like rich snippets, FAQs, and knowledge panels. Even if your site's SEO score isn't great, enabling rich results might push it to 100.

This is something I'd normally ask ChatGPT to do; I'd write the content and ask it to apply the appropriate syntax. It cuts the time down considerably, but since it's blocked, I figured I'd write it myself.

Or I could copy/paste the schema markup from my main site, codedoodles.xyz, and adapt it. It's my code, after all.

What's more, I could also add an FAQ section. But I remember reading that Google's policy says you can't add markup for content that isn't visible on the website. And obviously, my website doesn't have an FAQ section.

But that's fine. I can still add a question like “Who is Chenuli Jayasinghe?”, with an answer summarizing the content on the site. Win!
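The markup would look something like this (a sketch using the standard schema.org FAQPage type; the answer text is a placeholder, not the site's actual copy):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "FAQPage",
      "mainEntity": [{
        "@type": "Question",
        "name": "Who is Chenuli Jayasinghe?",
        "acceptedAnswer": {
          "@type": "Answer",
          "text": "Chenuli Jayasinghe is a developer and technical writer who runs Programming and Doodles."
        }
      }]
    }
    </script>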

It looks great. The score should go up after publishing.

Hmm, not bad, but it could be better. The parent site, codedoodles.xyz, scored 100, so this one should too. This is the point where I'd ask ChatGPT or DeepSeek for suggestions. But wait, I have an advantage: I can check the codedoodles.xyz code to find out what makes it 100.

Open Graph? Done.

Twitter card? Done.

Schema markup? Done too.

What next? Let me add more keywords, then.

Still the same score.

Oh yes, I must be missing the “alt” descriptions on the images.

No, I've added those too. Dang, how stupid am I? The Lighthouse report shows the same details.

The robots.txt file is the problem. codedoodles.xyz and chenuli.codedoodles.xyz are both my sites, so I should be able to do some copy-pasting. Once again.
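For reference, a working robots.txt doesn't need much (a sketch; the sitemap URL is my assumption, not copied from the real file):

    User-agent: *
    Allow: /

    Sitemap: https://chenuli.codedoodles.xyz/sitemap.xml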

    Just like the old days.

    Yes! It’s 100 now.

That should be it. The merch design has to do with AI chatbots, and I've survived the workday. Time to go read a book, like a great developer!

Summary


It was a day of pretending to be a clueless, helpless developer, and I'm happy with it. Although my critical thinking has clearly been affected (I made some stupid mistakes), I can still get things done. The whole process took longer than usual, especially writing the scraping script, but it felt more rewarding. Like finding your old bike in the garage and realizing you haven't forgotten how to ride it.

Sure, I involuntarily opened ChatGPT several times, and yes, Googling and copy-pasting felt guilty. But hey, this is how we coded before 2022, right? StackOverflow and documentation were our best friends, and they still work just fine.

The main takeaway is that using AI tools is completely fine and understandable. But once in a while, do what I just did: block those LLMs and try doing things on your own. You'll feel satisfied, believe me.

Please, don't be like that guy on Reddit. Use your brain.
