Watson Tech World
I just wrote an article about what I consider to be the best way to learn coding. Check it out 👇
jprbarry.com/the-best-way-to-learn-coding/
1 month ago
Watson Tech World
We recently passed 100 subscribers. Thank you to everyone who follows this channel!!! 🙏
1 month ago
Watson Tech World
We finally got our first horizontal video with more than 1000 views. 🙏 #excited #yay #thanks
2 months ago
Watson Tech World
Learn 30+ Linux commands in this guide 🐧. Perfect for devs, data scientists & sysadmins. Real examples to boost your CLI skills!
https://youtu.be/wtv9xcLYzTk
#Linux #Bash #CommandLine #DataScience #tech
5 months ago
Watson Tech World
One of the most beautiful mathematical facts I’ve come across is that the constant e (≈2.71828…) is exactly the sum of 1/n! for n = 0 to infinity.
Or put another way:
e = sum (1 / n!) for n = 0 ... ∞
A little Python code can demonstrate this 👇
def factorial(n):
    # 0! and 1! are both 1.
    if n == 0 or n == 1:
        return 1
    # Multiply 2 * 3 * ... * n.
    cur_factorial = 1
    for i in range(2, n + 1):  # use i so we don't shadow the argument n
        cur_factorial = i * cur_factorial
    return cur_factorial

def calculate_e_with_infinite_sum(end_index=100):
    # Approximate e by summing 1 / n! for n = 0 .. end_index.
    cur_sum = 0
    for n in range(0, end_index + 1):
        cur_n_factorial = factorial(n)
        cur_fraction = 1 / cur_n_factorial
        cur_sum = cur_sum + cur_fraction
    return cur_sum
calculate_e_with_infinite_sum()
# 2.7182818284590455
calculate_e_with_infinite_sum(end_index=10)
# 2.7182818011463845
calculate_e_with_infinite_sum(end_index=1)
# 2.0
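For an extra check, Python's standard library provides math.factorial and math.e, so you can verify the partial sum independently of the code above:

import math

# Sum 1 / n! for n = 0..100 using the standard-library factorial,
# then compare against Python's built-in constant math.e.
approx = sum(1 / math.factorial(n) for n in range(0, 101))
print(abs(approx - math.e))  # tiny, on the order of 1e-16 (floating-point precision)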
#math #maths #learning #Python
5 months ago
Watson Tech World
I first started using Grok 4 with the xAI API yesterday. I'm quite impressed with it; the quality seems similar to Grok 3's, and I look forward to using it more. #Grok #Grok4 #AI #GenerativeAI #API #xAI
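For anyone curious, here is a minimal sketch of what such a call might look like, assuming the xAI API's OpenAI-compatible endpoint at https://api.x.ai/v1, a key stored in an XAI_API_KEY environment variable, and the model name "grok-4" (check the xAI docs for current details); it uses the openai Python package.

import os
from openai import OpenAI

# Assumptions: the xAI API exposes an OpenAI-compatible endpoint at
# https://api.x.ai/v1, the key is in the XAI_API_KEY environment variable,
# and the model name is "grok-4" (check the xAI docs for current details).
client = OpenAI(
    api_key=os.environ["XAI_API_KEY"],
    base_url="https://api.x.ai/v1",
)

response = client.chat.completions.create(
    model="grok-4",
    messages=[{"role": "user", "content": "In one sentence, what is Grok?"}],
)
print(response.choices[0].message.content)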
5 months ago
Watson Tech World
Check out my beginner-friendly Arch Linux install guide using archinstall! Perfect for those curious about Linux & tech. Data scientist here, breaking down complex ideas simply. Watch now 👇
https://youtu.be/XVCDvgQ5F8g
#ArchLinux #DataScience #Linux
5 months ago
Watson Tech World
We finally got a long-form video with more than 500 views. It shows how to set up LibreTranslate, an open-source alternative to Google Translate. Check it out below! 🙏
https://www.youtube.com/watch?v=JKpGD...
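If you want to call a LibreTranslate instance from code, here is a minimal Python sketch, assuming an instance running locally on the default port 5000 (adjust the URL for your own server, and add an api_key field if your instance requires one); it uses the requests library.

import requests

# Assumes a LibreTranslate instance running locally on the default port 5000;
# adjust the URL for your own server, and add an "api_key" field if required.
resp = requests.post(
    "http://localhost:5000/translate",
    json={
        "q": "Hello, world!",
        "source": "en",
        "target": "es",
        "format": "text",
    },
    timeout=30,
)
print(resp.json()["translatedText"])  # e.g. "¡Hola, mundo!"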
#LibreTranslate #OpenSource #TranslationAPI #DigitalOcean #PythonTutorial #Python #OpenSourceTranslation #VPS #PythonAPI #language #languages #translation
5 months ago (edited)
Watson Tech World
In my opinion, mistral, a 7-billion-parameter model available to download through ollama, is one of the best open-source AI models you can run on a moderately powerful computer without a GPU. mistral comes from the company Mistral AI, which I have previously made a video about, and I think its output quality is very good.
I love using models from OpenAI (ChatGPT), Anthropic, and Google (Gemini), among others, but I am usually running those on the companies' own servers. Running AI models locally with ollama is great, but most good models need a powerful computer with one or more GPUs.
Currently mistral is the 9th most popular AI model on ollama, with a total of 16.1M pulls.
After installing ollama, you can get the model with:
ollama pull mistral
and you can run it with:
ollama run mistral
You can check out the mistral page on ollama below:
ollama.com/library/mistral
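If you prefer to call the model from code instead of the interactive prompt, here is a minimal Python sketch, assuming a local Ollama server on its default port 11434 and its /api/generate endpoint; it uses the requests library.

import requests

# Assumes a local Ollama server on its default port 11434 with the mistral
# model already pulled (ollama pull mistral).
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "mistral",
        "prompt": "Explain in one sentence what a GPU is.",
        "stream": False,  # return one JSON object instead of a token stream
    },
    timeout=120,
)
print(resp.json()["response"])

Setting "stream" to False keeps the example simple; leaving streaming on would return the reply token by token instead.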
#ollama #mistral #AI #chatbot #OpenSourceAI
5 months ago
Watson Tech World
Our channel finally passed 10,000 views. Thank you!!! 🙏 #excited
7 months ago