
Non-Euclidean brains (published 10 months ago)

Tags: connectome, non-Euclidean geometry, visualization, artificial intelligence




Non-Euclidean brains

Finding suitable embeddings for connectomes (spatially embedded complex networks that map neural connections in the brain) is crucial for analyzing and understanding cognitive processes. Recent studies have found two-dimensional hyperbolic embeddings superior to Euclidean embeddings in modeling connectomes across species, especially human connectomes. However, those studies had limitations: geometries other than Euclidean, hyperbolic, or spherical were not considered. Following William Thurston's suggestion that the networks of neurons in the brain could be successfully represented in Solv geometry, we study the goodness-of-fit of the embeddings for 21 connectome networks (8 species). To this end, we suggest an embedding algorithm based on simulated annealing that allows us to embed connectomes into Euclidean, spherical, hyperbolic, Solv, Nil, and product geometries. Our algorithm tends to find better embeddings than the state of the art, even in the hyperbolic case. Our findings suggest that while three-dimensional hyperbolic embeddings yield the best results in many cases, Solv embeddings perform reasonably well.

This is a visualization accompanying our ECAI 2024 paper "Modelling brain connectomes networks: Solv is a worthy competitor to hyperbolic geometry!" (arXiv: http://arxiv.org/abs/2407.16077 ).

Geometries are visualized as follows:
  • Euclidean 3D -- obvious
  • Hyperbolic 3D -- PoincarΓ© ball (except first-person perspective for the H3 manifold)
  • Nil, Solv -- the screen XYZ coordinates correspond to the Lie logarithm of the point (in the case of Nil, this is the same model as in "Nil geometry explained"; the geodesic ball is longer along the 'Z' axis, and in the visualization we rotate around the Y axis)
  • H2xR -- azimuthal equidistant (the distance and direction from the center are mapped faithfully)
  • Twist (twisted product of H2xR) -- each layer uses the azimuthal equidistant projection
  • Spherical 3D -- azimuthal equidistant projection
  • Hyperbolic 2D -- PoincarΓ© disk

Edges are drawn as geodesics (except in Solv). All nodes are drawn as balls of the same size, so their size and distortion can be used to gauge the scaling of the projection.

Our embedder is based on the maximum likelihood method, assuming that two nodes at distance d are connected (independently) with probability 1/(1+exp((d-R)/T)). That is, the parameters R and T and the positions of the nodes are chosen so that the probability of obtaining the connections and non-connections of the actual dataset is maximized.

NLL (normalized log-likelihood), MAP, IMR (inverse MeanRank), SC (greedy success rate), and IST (inverse greedy stretch) are quality measures from the literature, normalized to [0,1]. For every connectome, we show the geometries in the top 3 according to the Copeland voting rule over these measures.

Music:
  • Somatic Cosmos by Timo Petmanson (petmanson)
  • the Sphere by Jakub Steiner (jimmac)
  • Lost Mountain by Lincoln Domina (HyperRogue soundtrack)

YouTube compression is not great with such a visualization. Try selecting a higher quality in YouTube, or download the file here: https://drive.google.com/file/d/1kbWD...
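
For readers who have not met the two disk projections named in the list above, the difference is easy to state in code. This is a minimal sketch for the hyperbolic plane only (the PoincarΓ© disk used for hyperbolic 2D, and the azimuthal equidistant map used for the H2 factor of H2xR); the function names are our own and the snippet is an illustration of the projections, not part of the visualization pipeline.

import math

def poincare_disk(r, theta):
    """Poincare disk model: a point of the hyperbolic plane at hyperbolic
    distance r from the origin, in direction theta, is drawn at Euclidean
    radius tanh(r / 2), so the whole plane fits inside the unit disk."""
    rho = math.tanh(r / 2)
    return rho * math.cos(theta), rho * math.sin(theta)

def azimuthal_equidistant(r, theta):
    """Azimuthal equidistant projection: the distance and direction from
    the center are mapped faithfully (drawn radius equals r); distortion
    only grows away from the center."""
    return r * math.cos(theta), r * math.sin(theta)

# A point 3 units from the center hugs the boundary in the Poincare disk
# but keeps its true radius in the azimuthal equidistant projection.
print(poincare_disk(3.0, 0.0))          # approx (0.905, 0.0)
print(azimuthal_equidistant(3.0, 0.0))  # (3.0, 0.0)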
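
The maximum-likelihood objective and the simulated-annealing search described above can be sketched in a few lines of Python. This is not the authors' embedder: it uses plain Euclidean 3D distance as a stand-in (the actual algorithm substitutes the distance function of the target geometry and also optimizes R and T), and the proposal move, cooling schedule, and step budget below are assumptions made for illustration.

import math
import random

def connection_probability(d, R, T):
    """Probability that two nodes at distance d are connected,
    i.e. 1 / (1 + exp((d - R) / T))."""
    return 1.0 / (1.0 + math.exp((d - R) / T))

def euclidean_distance(p, q):
    """Euclidean 3D distance, used here as a placeholder for the
    distance function of the target geometry."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def negative_log_likelihood(positions, edges, R, T, dist=euclidean_distance):
    """Each node pair contributes independently: -log p for an edge,
    -log(1 - p) for a non-edge."""
    nodes = sorted(positions)
    nll = 0.0
    for a in range(len(nodes)):
        for b in range(a + 1, len(nodes)):
            i, j = nodes[a], nodes[b]
            p = connection_probability(dist(positions[i], positions[j]), R, T)
            p = min(max(p, 1e-12), 1.0 - 1e-12)  # guard against log(0)
            connected = (i, j) in edges or (j, i) in edges
            nll -= math.log(p) if connected else math.log(1.0 - p)
    return nll

def anneal(positions, edges, R=1.0, T=0.5, steps=10_000, start_temp=1.0):
    """Minimize the negative log-likelihood by perturbing one node at a
    time and accepting worse states with the Metropolis probability
    exp(-delta / temperature)."""
    current = negative_log_likelihood(positions, edges, R, T)
    for step in range(steps):
        temp = start_temp * (1.0 - step / steps) + 1e-9  # linear cooling
        node = random.choice(list(positions))
        old = positions[node]
        positions[node] = tuple(c + random.gauss(0.0, 0.1) for c in old)
        candidate = negative_log_likelihood(positions, edges, R, T)
        delta = candidate - current
        if delta <= 0 or random.random() < math.exp(-delta / temp):
            current = candidate       # accept the move
        else:
            positions[node] = old     # reject and roll back
    return positions, current

Recomputing the full likelihood after every move is O(n^2) and only there to keep the sketch short; in practice one would update just the pairs involving the moved node.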
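
The "top 3 according to the Copeland voting rule" selection can also be made concrete. Below is a sketch of the standard Copeland rule with the five quality measures acting as voters; the exact variant used in the paper (point scheme, tie handling) is not specified in the description, and the example scores are invented purely to show the call.

from itertools import combinations

def copeland_top_k(scores, k=3):
    """Rank candidates (geometries) with the Copeland rule: for each pair,
    the candidate preferred by more voters (quality measures, higher is
    better) gains a point and the other loses one; ties change nothing."""
    geometries = list(scores)
    measures = set().union(*(scores[g] for g in geometries))
    points = {g: 0 for g in geometries}
    for g1, g2 in combinations(geometries, 2):
        wins1 = sum(scores[g1][m] > scores[g2][m] for m in measures)
        wins2 = sum(scores[g2][m] > scores[g1][m] for m in measures)
        if wins1 > wins2:
            points[g1] += 1
            points[g2] -= 1
        elif wins2 > wins1:
            points[g2] += 1
            points[g1] -= 1
    return sorted(geometries, key=points.get, reverse=True)[:k]

# Invented scores, for illustration only (not results from the paper).
example = {
    "H3":   {"NLL": 0.92, "MAP": 0.88, "IMR": 0.90, "SC": 0.85, "IST": 0.91},
    "Solv": {"NLL": 0.90, "MAP": 0.89, "IMR": 0.87, "SC": 0.86, "IST": 0.88},
    "E3":   {"NLL": 0.80, "MAP": 0.78, "IMR": 0.75, "SC": 0.77, "IST": 0.79},
    "S3":   {"NLL": 0.70, "MAP": 0.72, "IMR": 0.68, "SC": 0.71, "IST": 0.69},
}
print(copeland_top_k(example))  # ['H3', 'Solv', 'E3']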

Comments
  • Hough Transform | Boundary Detection (published 4 years ago, 202,614 views)
  • But what is a neural network? | Deep learning chapter 1 (published 7 years ago, 19,541,382 views)
  • How One Line in the Oldest Math Text Hinted at Hidden Universes (published 1 year ago, 14,339,160 views)
  • Blender Tutorial for Complete Beginners - Part 1 (published 1 year ago, 8,663,396 views)
  • Nuclear war: a scenario. How the apocalypse would technically unfold (published 2 days ago, 984,192 views)
  • How might LLMs store facts | DL7 (published 9 months ago, 1,402,399 views)
  • Gradient descent, how neural networks learn | DL2 (published 7 years ago, 7,818,054 views)
  • How do hackers live? The FSB, the darknet, and millions of dollars | Stories of scammers, cyber-partisans, and "zetniks" (published 2 days ago, 337,629 views)
  • The Most Important Algorithm in Machine Learning (published 1 year ago, 720,422 views)
  • Portals to Non-Euclidean Geometries (published 3 years ago, 1,480,554 views)
